Nov 25 05:36:52 localhost kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 25 05:36:52 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 25 05:36:52 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 25 05:36:52 localhost kernel: BIOS-provided physical RAM map:
Nov 25 05:36:52 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 25 05:36:52 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 25 05:36:52 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 25 05:36:52 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdafff] usable
Nov 25 05:36:52 localhost kernel: BIOS-e820: [mem 0x000000007ffdb000-0x000000007fffffff] reserved
Nov 25 05:36:52 localhost kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Nov 25 05:36:52 localhost kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Nov 25 05:36:52 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 25 05:36:52 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 25 05:36:52 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000027fffffff] usable
Nov 25 05:36:52 localhost kernel: NX (Execute Disable) protection: active
Nov 25 05:36:52 localhost kernel: APIC: Static calls initialized
Nov 25 05:36:52 localhost kernel: SMBIOS 2.8 present.
Nov 25 05:36:52 localhost kernel: DMI: Red Hat OpenStack Compute/RHEL, BIOS 1.16.1-1.el9 04/01/2014
Nov 25 05:36:52 localhost kernel: Hypervisor detected: KVM
Nov 25 05:36:52 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 25 05:36:52 localhost kernel: kvm-clock: using sched offset of 2889644639 cycles
Nov 25 05:36:52 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 25 05:36:52 localhost kernel: tsc: Detected 2445.406 MHz processor
Nov 25 05:36:52 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 25 05:36:52 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 25 05:36:52 localhost kernel: last_pfn = 0x280000 max_arch_pfn = 0x400000000
Nov 25 05:36:52 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 25 05:36:52 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 25 05:36:52 localhost kernel: last_pfn = 0x7ffdb max_arch_pfn = 0x400000000
Nov 25 05:36:52 localhost kernel: found SMP MP-table at [mem 0x000f5b60-0x000f5b6f]
Nov 25 05:36:52 localhost kernel: Using GB pages for direct mapping
Nov 25 05:36:52 localhost kernel: RAMDISK: [mem 0x2ed25000-0x3368afff]
Nov 25 05:36:52 localhost kernel: ACPI: Early table checksum verification disabled
Nov 25 05:36:52 localhost kernel: ACPI: RSDP 0x00000000000F5B20 000014 (v00 BOCHS )
Nov 25 05:36:52 localhost kernel: ACPI: RSDT 0x000000007FFE35EB 000034 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 05:36:52 localhost kernel: ACPI: FACP 0x000000007FFE3403 0000F4 (v03 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 05:36:52 localhost kernel: ACPI: DSDT 0x000000007FFDFCC0 003743 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 05:36:52 localhost kernel: ACPI: FACS 0x000000007FFDFC80 000040
Nov 25 05:36:52 localhost kernel: ACPI: APIC 0x000000007FFE34F7 000090 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 05:36:52 localhost kernel: ACPI: MCFG 0x000000007FFE3587 00003C (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 05:36:52 localhost kernel: ACPI: WAET 0x000000007FFE35C3 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 05:36:52 localhost kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe3403-0x7ffe34f6]
Nov 25 05:36:52 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfcc0-0x7ffe3402]
Nov 25 05:36:52 localhost kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfc80-0x7ffdfcbf]
Nov 25 05:36:52 localhost kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe34f7-0x7ffe3586]
Nov 25 05:36:52 localhost kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe3587-0x7ffe35c2]
Nov 25 05:36:52 localhost kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe35c3-0x7ffe35ea]
Nov 25 05:36:52 localhost kernel: No NUMA configuration found
Nov 25 05:36:52 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000027fffffff]
Nov 25 05:36:52 localhost kernel: NODE_DATA(0) allocated [mem 0x27ffd5000-0x27fffffff]
Nov 25 05:36:52 localhost kernel: crashkernel reserved: 0x000000006f000000 - 0x000000007f000000 (256 MB)
Nov 25 05:36:52 localhost kernel: Zone ranges:
Nov 25 05:36:52 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 25 05:36:52 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 25 05:36:52 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000027fffffff]
Nov 25 05:36:52 localhost kernel:   Device   empty
Nov 25 05:36:52 localhost kernel: Movable zone start for each node
Nov 25 05:36:52 localhost kernel: Early memory node ranges
Nov 25 05:36:52 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 25 05:36:52 localhost kernel:   node   0: [mem 0x0000000000100000-0x000000007ffdafff]
Nov 25 05:36:52 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000027fffffff]
Nov 25 05:36:52 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000027fffffff]
Nov 25 05:36:52 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 25 05:36:52 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 25 05:36:52 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 25 05:36:52 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 25 05:36:52 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 25 05:36:52 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 25 05:36:52 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 25 05:36:52 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 25 05:36:52 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 25 05:36:52 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 25 05:36:52 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 25 05:36:52 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 25 05:36:52 localhost kernel: TSC deadline timer available
Nov 25 05:36:52 localhost kernel: CPU topo: Max. logical packages:   4
Nov 25 05:36:52 localhost kernel: CPU topo: Max. logical dies:       4
Nov 25 05:36:52 localhost kernel: CPU topo: Max. dies per package:   1
Nov 25 05:36:52 localhost kernel: CPU topo: Max. threads per core:   1
Nov 25 05:36:52 localhost kernel: CPU topo: Num. cores per package:     1
Nov 25 05:36:52 localhost kernel: CPU topo: Num. threads per package:   1
Nov 25 05:36:52 localhost kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Nov 25 05:36:52 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 25 05:36:52 localhost kernel: kvm-guest: KVM setup pv remote TLB flush
Nov 25 05:36:52 localhost kernel: kvm-guest: setup PV sched yield
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x7ffdb000-0x7fffffff]
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x80000000-0xafffffff]
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xb0000000-0xbfffffff]
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfed1bfff]
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfed1c000-0xfed1ffff]
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfed20000-0xfeffbfff]
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 25 05:36:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 25 05:36:52 localhost kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Nov 25 05:36:52 localhost kernel: Booting paravirtualized kernel on KVM
Nov 25 05:36:52 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 25 05:36:52 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Nov 25 05:36:52 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u524288
Nov 25 05:36:52 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u524288 alloc=1*2097152
Nov 25 05:36:52 localhost kernel: pcpu-alloc: [0] 0 1 2 3 
Nov 25 05:36:52 localhost kernel: kvm-guest: PV spinlocks enabled
Nov 25 05:36:52 localhost kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Nov 25 05:36:52 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 25 05:36:52 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 25 05:36:52 localhost kernel: random: crng init done
Nov 25 05:36:52 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 25 05:36:52 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 25 05:36:52 localhost kernel: Fallback order for Node 0: 0 
Nov 25 05:36:52 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 25 05:36:52 localhost kernel: Policy zone: Normal
Nov 25 05:36:52 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 25 05:36:52 localhost kernel: software IO TLB: area num 4.
Nov 25 05:36:52 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Nov 25 05:36:52 localhost kernel: ftrace: allocating 49313 entries in 193 pages
Nov 25 05:36:52 localhost kernel: ftrace: allocated 193 pages with 3 groups
Nov 25 05:36:52 localhost kernel: Dynamic Preempt: voluntary
Nov 25 05:36:52 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 25 05:36:52 localhost kernel: rcu:         RCU event tracing is enabled.
Nov 25 05:36:52 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=4.
Nov 25 05:36:52 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 25 05:36:52 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 25 05:36:52 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 25 05:36:52 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 25 05:36:52 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Nov 25 05:36:52 localhost kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Nov 25 05:36:52 localhost kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Nov 25 05:36:52 localhost kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Nov 25 05:36:52 localhost kernel: NR_IRQS: 524544, nr_irqs: 456, preallocated irqs: 16
Nov 25 05:36:52 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 25 05:36:52 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 25 05:36:52 localhost kernel: Console: colour VGA+ 80x25
Nov 25 05:36:52 localhost kernel: printk: console [ttyS0] enabled
Nov 25 05:36:52 localhost kernel: ACPI: Core revision 20230331
Nov 25 05:36:52 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 25 05:36:52 localhost kernel: x2apic enabled
Nov 25 05:36:52 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Nov 25 05:36:52 localhost kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Nov 25 05:36:52 localhost kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Nov 25 05:36:52 localhost kernel: kvm-guest: setup PV IPIs
Nov 25 05:36:52 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 25 05:36:52 localhost kernel: Calibrating delay loop (skipped) preset value.. 4890.81 BogoMIPS (lpj=2445406)
Nov 25 05:36:52 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 25 05:36:52 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 25 05:36:52 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 25 05:36:52 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 25 05:36:52 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 25 05:36:52 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 25 05:36:52 localhost kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Nov 25 05:36:52 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 25 05:36:52 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 25 05:36:52 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 25 05:36:52 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 25 05:36:52 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 25 05:36:52 localhost kernel: Transient Scheduler Attacks: Vulnerable: No microcode
Nov 25 05:36:52 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 25 05:36:52 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 25 05:36:52 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 25 05:36:52 localhost kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Nov 25 05:36:52 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 25 05:36:52 localhost kernel: x86/fpu: xstate_offset[9]:  832, xstate_sizes[9]:    8
Nov 25 05:36:52 localhost kernel: x86/fpu: Enabled xstate features 0x207, context size is 840 bytes, using 'compacted' format.
Nov 25 05:36:52 localhost kernel: Freeing SMP alternatives memory: 40K
Nov 25 05:36:52 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 25 05:36:52 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 25 05:36:52 localhost kernel: landlock: Up and running.
Nov 25 05:36:52 localhost kernel: Yama: becoming mindful.
Nov 25 05:36:52 localhost kernel: SELinux:  Initializing.
Nov 25 05:36:52 localhost kernel: LSM support for eBPF active
Nov 25 05:36:52 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 25 05:36:52 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 25 05:36:52 localhost kernel: smpboot: CPU0: AMD EPYC 7763 64-Core Processor (family: 0x19, model: 0x1, stepping: 0x1)
Nov 25 05:36:52 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 25 05:36:52 localhost kernel: ... version:                0
Nov 25 05:36:52 localhost kernel: ... bit width:              48
Nov 25 05:36:52 localhost kernel: ... generic registers:      6
Nov 25 05:36:52 localhost kernel: ... value mask:             0000ffffffffffff
Nov 25 05:36:52 localhost kernel: ... max period:             00007fffffffffff
Nov 25 05:36:52 localhost kernel: ... fixed-purpose events:   0
Nov 25 05:36:52 localhost kernel: ... event mask:             000000000000003f
Nov 25 05:36:52 localhost kernel: signal: max sigframe size: 3376
Nov 25 05:36:52 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 25 05:36:52 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 25 05:36:52 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 25 05:36:52 localhost kernel: smpboot: x86: Booting SMP configuration:
Nov 25 05:36:52 localhost kernel: .... node  #0, CPUs:      #1 #2 #3
Nov 25 05:36:52 localhost kernel: smp: Brought up 1 node, 4 CPUs
Nov 25 05:36:52 localhost kernel: smpboot: Total of 4 processors activated (19563.24 BogoMIPS)
Nov 25 05:36:52 localhost kernel: node 0 deferred pages initialised in 9ms
Nov 25 05:36:52 localhost kernel: Memory: 7778908K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 604512K reserved, 0K cma-reserved)
Nov 25 05:36:52 localhost kernel: devtmpfs: initialized
Nov 25 05:36:52 localhost kernel: x86/mm: Memory block size: 128MB
Nov 25 05:36:52 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 25 05:36:52 localhost kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Nov 25 05:36:52 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 25 05:36:52 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 25 05:36:52 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 25 05:36:52 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 25 05:36:52 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 25 05:36:52 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 25 05:36:52 localhost kernel: audit: type=2000 audit(1764049011.500:1): state=initialized audit_enabled=0 res=1
Nov 25 05:36:52 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 25 05:36:52 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 25 05:36:52 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 25 05:36:52 localhost kernel: cpuidle: using governor menu
Nov 25 05:36:52 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 25 05:36:52 localhost kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Nov 25 05:36:52 localhost kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Nov 25 05:36:52 localhost kernel: PCI: Using configuration type 1 for base access
Nov 25 05:36:52 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 25 05:36:52 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 25 05:36:52 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 25 05:36:52 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 25 05:36:52 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 25 05:36:52 localhost kernel: Demotion targets for Node 0: null
Nov 25 05:36:52 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 25 05:36:52 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 25 05:36:52 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 25 05:36:52 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 25 05:36:52 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 25 05:36:52 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 25 05:36:52 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 25 05:36:52 localhost kernel: ACPI: Interpreter enabled
Nov 25 05:36:52 localhost kernel: ACPI: PM: (supports S0 S5)
Nov 25 05:36:52 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 25 05:36:52 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 25 05:36:52 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 25 05:36:52 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Nov 25 05:36:52 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 25 05:36:52 localhost kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 25 05:36:52 localhost kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR DPC]
Nov 25 05:36:52 localhost kernel: acpi PNP0A08:00: _OSC: OS now controls [SHPCHotplug PME AER PCIeCapability]
Nov 25 05:36:52 localhost kernel: PCI host bridge to bus 0000:00
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x280000000-0xa7fffffff window]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Nov 25 05:36:52 localhost kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 25 05:36:52 localhost kernel: pci 0000:00:01.0: BAR 0 [mem 0xf9800000-0xf9ffffff pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:01.0: BAR 2 [mem 0xfc200000-0xfc203fff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.0:   bridge window [io  0xc000-0xcfff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc600000-0xfc9fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfe800000-0xfe9fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfbe00000-0xfbffffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfe600000-0xfe7fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfbc00000-0xfbdfffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfe400000-0xfe5fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfba00000-0xfbbfffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfe200000-0xfe3fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfb800000-0xfb9fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfe000000-0xfe1fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfb600000-0xfb7fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfde00000-0xfdffffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfb400000-0xfb5fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfdc00000-0xfddfffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfb200000-0xfb3fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfda00000-0xfdbfffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfb000000-0xfb1fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1: BAR 0 [mem 0xfea1a000-0xfea1afff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfd800000-0xfd9fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfae00000-0xfaffffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2: BAR 0 [mem 0xfea1b000-0xfea1bfff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfd600000-0xfd7fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfac00000-0xfadfffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3: BAR 0 [mem 0xfea1c000-0xfea1cfff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfd400000-0xfd5fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfaa00000-0xfabfffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4: BAR 0 [mem 0xfea1d000-0xfea1dfff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfd200000-0xfd3fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfa800000-0xfa9fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5: BAR 0 [mem 0xfea1e000-0xfea1efff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfd000000-0xfd1fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfa600000-0xfa7fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6: BAR 0 [mem 0xfea1f000-0xfea1ffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfce00000-0xfcffffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfa400000-0xfa5fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7: BAR 0 [mem 0xfea20000-0xfea20fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfcc00000-0xfcdfffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfa200000-0xfa3fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0: BAR 0 [mem 0xfea21000-0xfea21fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfca00000-0xfcbfffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfa000000-0xfa1fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Nov 25 05:36:52 localhost kernel: pci 0000:00:1f.0: quirk: [io  0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Nov 25 05:36:52 localhost kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Nov 25 05:36:52 localhost kernel: pci 0000:00:1f.2: BAR 4 [io  0xd040-0xd05f]
Nov 25 05:36:52 localhost kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea22000-0xfea22fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Nov 25 05:36:52 localhost kernel: pci 0000:00:1f.3: BAR 4 [io  0x0700-0x073f]
Nov 25 05:36:52 localhost kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
Nov 25 05:36:52 localhost kernel: pci 0000:01:00.0: BAR 0 [mem 0xfc800000-0xfc8000ff 64bit]
Nov 25 05:36:52 localhost kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Nov 25 05:36:52 localhost kernel: pci 0000:01:00.0:   bridge window [io  0xc000-0xcfff]
Nov 25 05:36:52 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc600000-0xfc7fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:02: extended config space not accessible
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [1] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [2] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [3] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [4] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [5] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [6] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [7] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [8] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [9] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [10] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [11] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [12] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [13] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [14] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [15] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [16] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [17] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [18] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [19] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [20] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [21] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [22] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [23] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [24] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [25] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [26] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [27] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [28] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [29] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [30] registered
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [31] registered
Nov 25 05:36:52 localhost kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 25 05:36:52 localhost kernel: pci 0000:02:01.0: BAR 4 [io  0xc000-0xc01f]
Nov 25 05:36:52 localhost kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-2] registered
Nov 25 05:36:52 localhost kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Nov 25 05:36:52 localhost kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe840000-0xfe840fff]
Nov 25 05:36:52 localhost kernel: pci 0000:03:00.0: BAR 4 [mem 0xfbe00000-0xfbe03fff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:03:00.0: ROM [mem 0xfe800000-0xfe83ffff pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-3] registered
Nov 25 05:36:52 localhost kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Nov 25 05:36:52 localhost kernel: pci 0000:04:00.0: BAR 1 [mem 0xfe600000-0xfe600fff]
Nov 25 05:36:52 localhost kernel: pci 0000:04:00.0: BAR 4 [mem 0xfbc00000-0xfbc03fff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-4] registered
Nov 25 05:36:52 localhost kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Nov 25 05:36:52 localhost kernel: pci 0000:05:00.0: BAR 4 [mem 0xfba00000-0xfba03fff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-5] registered
Nov 25 05:36:52 localhost kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Nov 25 05:36:52 localhost kernel: pci 0000:06:00.0: BAR 4 [mem 0xfb800000-0xfb803fff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-6] registered
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-7] registered
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-8] registered
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-9] registered
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-10] registered
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-11] registered
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-12] registered
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-13] registered
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-14] registered
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-15] registered
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-16] registered
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Nov 25 05:36:52 localhost kernel: acpiphp: Slot [0-17] registered
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Nov 25 05:36:52 localhost kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Nov 25 05:36:52 localhost kernel: iommu: Default domain type: Translated
Nov 25 05:36:52 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 25 05:36:52 localhost kernel: SCSI subsystem initialized
Nov 25 05:36:52 localhost kernel: ACPI: bus type USB registered
Nov 25 05:36:52 localhost kernel: usbcore: registered new interface driver usbfs
Nov 25 05:36:52 localhost kernel: usbcore: registered new interface driver hub
Nov 25 05:36:52 localhost kernel: usbcore: registered new device driver usb
Nov 25 05:36:52 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 25 05:36:52 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 25 05:36:52 localhost kernel: PTP clock support registered
Nov 25 05:36:52 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 25 05:36:52 localhost kernel: NetLabel: Initializing
Nov 25 05:36:52 localhost kernel: NetLabel:  domain hash size = 128
Nov 25 05:36:52 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 25 05:36:52 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 25 05:36:52 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 25 05:36:52 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 25 05:36:52 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 25 05:36:52 localhost kernel: e820: reserve RAM buffer [mem 0x7ffdb000-0x7fffffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Nov 25 05:36:52 localhost kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Nov 25 05:36:52 localhost kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 25 05:36:52 localhost kernel: vgaarb: loaded
Nov 25 05:36:52 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 25 05:36:52 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 25 05:36:52 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 25 05:36:52 localhost kernel: pnp: PnP ACPI init
Nov 25 05:36:52 localhost kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Nov 25 05:36:52 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 25 05:36:52 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 25 05:36:52 localhost kernel: NET: Registered PF_INET protocol family
Nov 25 05:36:52 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 25 05:36:52 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 25 05:36:52 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 25 05:36:52 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 25 05:36:52 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 25 05:36:52 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 25 05:36:52 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 25 05:36:52 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 25 05:36:52 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 25 05:36:52 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 25 05:36:52 localhost kernel: NET: Registered PF_XDP protocol family
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1: bridge window [io  0x1000-0x0fff] to [bus 03] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2: bridge window [io  0x1000-0x0fff] to [bus 04] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3: bridge window [io  0x1000-0x0fff] to [bus 05] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4: bridge window [io  0x1000-0x0fff] to [bus 06] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5: bridge window [io  0x1000-0x0fff] to [bus 07] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6: bridge window [io  0x1000-0x0fff] to [bus 08] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7: bridge window [io  0x1000-0x0fff] to [bus 09] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0: bridge window [io  0x1000-0x0fff] to [bus 0a] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1: bridge window [io  0x1000-0x0fff] to [bus 0b] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2: bridge window [io  0x1000-0x0fff] to [bus 0c] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3: bridge window [io  0x1000-0x0fff] to [bus 0d] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4: bridge window [io  0x1000-0x0fff] to [bus 0e] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5: bridge window [io  0x1000-0x0fff] to [bus 0f] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6: bridge window [io  0x1000-0x0fff] to [bus 10] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7: bridge window [io  0x1000-0x0fff] to [bus 11] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0: bridge window [io  0x1000-0x0fff] to [bus 12] add_size 1000
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1: bridge window [io  0x1000-0x1fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2: bridge window [io  0x2000-0x2fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3: bridge window [io  0x3000-0x3fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4: bridge window [io  0x4000-0x4fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5: bridge window [io  0x5000-0x5fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6: bridge window [io  0x6000-0x6fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7: bridge window [io  0x7000-0x7fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0: bridge window [io  0x8000-0x8fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1: bridge window [io  0x9000-0x9fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2: bridge window [io  0xa000-0xafff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3: bridge window [io  0xb000-0xbfff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4: bridge window [io  0xe000-0xefff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5: bridge window [io  0xf000-0xffff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6: bridge window [io  size 0x1000]: can't assign; no space
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6: bridge window [io  size 0x1000]: failed to assign
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7: bridge window [io  size 0x1000]: can't assign; no space
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7: bridge window [io  size 0x1000]: failed to assign
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0: bridge window [io  size 0x1000]: can't assign; no space
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0: bridge window [io  size 0x1000]: failed to assign
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0: bridge window [io  0x1000-0x1fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7: bridge window [io  0x2000-0x2fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6: bridge window [io  0x3000-0x3fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5: bridge window [io  0x4000-0x4fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4: bridge window [io  0x5000-0x5fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3: bridge window [io  0x6000-0x6fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2: bridge window [io  0x7000-0x7fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1: bridge window [io  0x8000-0x8fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0: bridge window [io  0x9000-0x9fff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7: bridge window [io  0xa000-0xafff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6: bridge window [io  0xb000-0xbfff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5: bridge window [io  0xe000-0xefff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4: bridge window [io  0xf000-0xffff]: assigned
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3: bridge window [io  size 0x1000]: can't assign; no space
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3: bridge window [io  size 0x1000]: failed to assign
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2: bridge window [io  size 0x1000]: can't assign; no space
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2: bridge window [io  size 0x1000]: failed to assign
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1: bridge window [io  size 0x1000]: can't assign; no space
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1: bridge window [io  size 0x1000]: failed to assign
Nov 25 05:36:52 localhost kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Nov 25 05:36:52 localhost kernel: pci 0000:01:00.0:   bridge window [io  0xc000-0xcfff]
Nov 25 05:36:52 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc600000-0xfc7fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.0:   bridge window [io  0xc000-0xcfff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc600000-0xfc9fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfe800000-0xfe9fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfbe00000-0xfbffffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfe600000-0xfe7fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfbc00000-0xfbdfffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfe400000-0xfe5fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfba00000-0xfbbfffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4:   bridge window [io  0xf000-0xffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfe200000-0xfe3fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfb800000-0xfb9fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5:   bridge window [io  0xe000-0xefff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfe000000-0xfe1fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfb600000-0xfb7fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6:   bridge window [io  0xb000-0xbfff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfde00000-0xfdffffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfb400000-0xfb5fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7:   bridge window [io  0xa000-0xafff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfdc00000-0xfddfffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfb200000-0xfb3fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0:   bridge window [io  0x9000-0x9fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfda00000-0xfdbfffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfb000000-0xfb1fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1:   bridge window [io  0x8000-0x8fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfd800000-0xfd9fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfae00000-0xfaffffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2:   bridge window [io  0x7000-0x7fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfd600000-0xfd7fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfac00000-0xfadfffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3:   bridge window [io  0x6000-0x6fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfd400000-0xfd5fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfaa00000-0xfabfffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4:   bridge window [io  0x5000-0x5fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfd200000-0xfd3fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfa800000-0xfa9fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5:   bridge window [io  0x4000-0x4fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfd000000-0xfd1fffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfa600000-0xfa7fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6:   bridge window [io  0x3000-0x3fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfce00000-0xfcffffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfa400000-0xfa5fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7:   bridge window [io  0x2000-0x2fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfcc00000-0xfcdfffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfa200000-0xfa3fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0:   bridge window [io  0x1000-0x1fff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfca00000-0xfcbfffff]
Nov 25 05:36:52 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfa000000-0xfa1fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:00: resource 9 [mem 0x280000000-0xa7fffffff window]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:01: resource 0 [io  0xc000-0xcfff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:01: resource 1 [mem 0xfc600000-0xfc9fffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:01: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:02: resource 0 [io  0xc000-0xcfff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:02: resource 1 [mem 0xfc600000-0xfc7fffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:02: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:03: resource 2 [mem 0xfbe00000-0xfbffffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:04: resource 2 [mem 0xfbc00000-0xfbdfffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:05: resource 2 [mem 0xfba00000-0xfbbfffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:06: resource 0 [io  0xf000-0xffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:06: resource 2 [mem 0xfb800000-0xfb9fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:07: resource 0 [io  0xe000-0xefff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:07: resource 2 [mem 0xfb600000-0xfb7fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:08: resource 0 [io  0xb000-0xbfff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:08: resource 2 [mem 0xfb400000-0xfb5fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:09: resource 0 [io  0xa000-0xafff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:09: resource 2 [mem 0xfb200000-0xfb3fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0a: resource 0 [io  0x9000-0x9fff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0a: resource 1 [mem 0xfda00000-0xfdbfffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0a: resource 2 [mem 0xfb000000-0xfb1fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0b: resource 0 [io  0x8000-0x8fff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0b: resource 1 [mem 0xfd800000-0xfd9fffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0b: resource 2 [mem 0xfae00000-0xfaffffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0c: resource 0 [io  0x7000-0x7fff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0c: resource 1 [mem 0xfd600000-0xfd7fffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0c: resource 2 [mem 0xfac00000-0xfadfffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0d: resource 0 [io  0x6000-0x6fff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0d: resource 1 [mem 0xfd400000-0xfd5fffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0d: resource 2 [mem 0xfaa00000-0xfabfffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0e: resource 0 [io  0x5000-0x5fff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0e: resource 1 [mem 0xfd200000-0xfd3fffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0e: resource 2 [mem 0xfa800000-0xfa9fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0f: resource 0 [io  0x4000-0x4fff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0f: resource 1 [mem 0xfd000000-0xfd1fffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:0f: resource 2 [mem 0xfa600000-0xfa7fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:10: resource 0 [io  0x3000-0x3fff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:10: resource 1 [mem 0xfce00000-0xfcffffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:10: resource 2 [mem 0xfa400000-0xfa5fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:11: resource 0 [io  0x2000-0x2fff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:11: resource 1 [mem 0xfcc00000-0xfcdfffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:11: resource 2 [mem 0xfa200000-0xfa3fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:12: resource 0 [io  0x1000-0x1fff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:12: resource 1 [mem 0xfca00000-0xfcbfffff]
Nov 25 05:36:52 localhost kernel: pci_bus 0000:12: resource 2 [mem 0xfa000000-0xfa1fffff 64bit pref]
Nov 25 05:36:52 localhost kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Nov 25 05:36:52 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 25 05:36:52 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 25 05:36:52 localhost kernel: software IO TLB: mapped [mem 0x000000006b000000-0x000000006f000000] (64MB)
Nov 25 05:36:52 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 25 05:36:52 localhost kernel: ACPI: bus type thunderbolt registered
Nov 25 05:36:52 localhost kernel: Initialise system trusted keyrings
Nov 25 05:36:52 localhost kernel: Key type blacklist registered
Nov 25 05:36:52 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 25 05:36:52 localhost kernel: zbud: loaded
Nov 25 05:36:52 localhost kernel: integrity: Platform Keyring initialized
Nov 25 05:36:52 localhost kernel: integrity: Machine keyring initialized
Nov 25 05:36:52 localhost kernel: Freeing initrd memory: 75160K
Nov 25 05:36:52 localhost kernel: NET: Registered PF_ALG protocol family
Nov 25 05:36:52 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 25 05:36:52 localhost kernel: Key type asymmetric registered
Nov 25 05:36:52 localhost kernel: Asymmetric key parser 'x509' registered
Nov 25 05:36:52 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 25 05:36:52 localhost kernel: io scheduler mq-deadline registered
Nov 25 05:36:52 localhost kernel: io scheduler kyber registered
Nov 25 05:36:52 localhost kernel: io scheduler bfq registered
Nov 25 05:36:52 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Nov 25 05:36:52 localhost kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39
Nov 25 05:36:52 localhost kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40
Nov 25 05:36:52 localhost kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40
Nov 25 05:36:52 localhost kernel: shpchp 0000:01:00.0: HPC vendor_id 1b36 device_id e ss_vid 0 ss_did 0
Nov 25 05:36:52 localhost kernel: shpchp 0000:01:00.0: pci_hp_register failed with error -16
Nov 25 05:36:52 localhost kernel: shpchp 0000:01:00.0: Slot initialization failed
Nov 25 05:36:52 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 25 05:36:52 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 25 05:36:52 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 25 05:36:52 localhost kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Nov 25 05:36:52 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 25 05:36:52 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 25 05:36:52 localhost kernel: Non-volatile memory driver v1.3
Nov 25 05:36:52 localhost kernel: rdac: device handler registered
Nov 25 05:36:52 localhost kernel: hp_sw: device handler registered
Nov 25 05:36:52 localhost kernel: emc: device handler registered
Nov 25 05:36:52 localhost kernel: alua: device handler registered
Nov 25 05:36:52 localhost kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller
Nov 25 05:36:52 localhost kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1
Nov 25 05:36:52 localhost kernel: uhci_hcd 0000:02:01.0: detected 2 ports
Nov 25 05:36:52 localhost kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x0000c000
Nov 25 05:36:52 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 25 05:36:52 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 25 05:36:52 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 25 05:36:52 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 25 05:36:52 localhost kernel: usb usb1: SerialNumber: 0000:02:01.0
Nov 25 05:36:52 localhost kernel: hub 1-0:1.0: USB hub found
Nov 25 05:36:52 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 25 05:36:52 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 25 05:36:52 localhost kernel: usbserial: USB Serial support registered for generic
Nov 25 05:36:52 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 25 05:36:52 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 25 05:36:52 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 25 05:36:52 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 25 05:36:52 localhost kernel: rtc_cmos 00:03: RTC can wake from S4
Nov 25 05:36:52 localhost kernel: rtc_cmos 00:03: registered as rtc0
Nov 25 05:36:52 localhost kernel: rtc_cmos 00:03: setting system clock to 2025-11-25T05:36:52 UTC (1764049012)
Nov 25 05:36:52 localhost kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Nov 25 05:36:52 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 25 05:36:52 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 25 05:36:52 localhost kernel: usbcore: registered new interface driver usbhid
Nov 25 05:36:52 localhost kernel: usbhid: USB HID core driver
Nov 25 05:36:52 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 25 05:36:52 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 25 05:36:52 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 25 05:36:52 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 25 05:36:52 localhost kernel: Initializing XFRM netlink socket
Nov 25 05:36:52 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 25 05:36:52 localhost kernel: Segment Routing with IPv6
Nov 25 05:36:52 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 25 05:36:52 localhost kernel: mpls_gso: MPLS GSO support
Nov 25 05:36:52 localhost kernel: IPI shorthand broadcast: enabled
Nov 25 05:36:52 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 25 05:36:52 localhost kernel: AES CTR mode by8 optimization enabled
Nov 25 05:36:52 localhost kernel: sched_clock: Marking stable (971004122, 146815914)->(1333707127, -215887091)
Nov 25 05:36:52 localhost kernel: registered taskstats version 1
Nov 25 05:36:52 localhost kernel: Loading compiled-in X.509 certificates
Nov 25 05:36:52 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 25 05:36:52 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 25 05:36:52 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 25 05:36:52 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 25 05:36:52 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 25 05:36:52 localhost kernel: Demotion targets for Node 0: null
Nov 25 05:36:52 localhost kernel: page_owner is disabled
Nov 25 05:36:52 localhost kernel: Key type .fscrypt registered
Nov 25 05:36:52 localhost kernel: Key type fscrypt-provisioning registered
Nov 25 05:36:52 localhost kernel: Key type big_key registered
Nov 25 05:36:52 localhost kernel: Key type encrypted registered
Nov 25 05:36:52 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 25 05:36:52 localhost kernel: Loading compiled-in module X.509 certificates
Nov 25 05:36:52 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 25 05:36:52 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 25 05:36:52 localhost kernel: ima: No architecture policies found
Nov 25 05:36:52 localhost kernel: evm: Initialising EVM extended attributes:
Nov 25 05:36:52 localhost kernel: evm: security.selinux
Nov 25 05:36:52 localhost kernel: evm: security.SMACK64 (disabled)
Nov 25 05:36:52 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 25 05:36:52 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 25 05:36:52 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 25 05:36:52 localhost kernel: evm: security.apparmor (disabled)
Nov 25 05:36:52 localhost kernel: evm: security.ima
Nov 25 05:36:52 localhost kernel: evm: security.capability
Nov 25 05:36:52 localhost kernel: evm: HMAC attrs: 0x1
Nov 25 05:36:52 localhost kernel: Running certificate verification RSA selftest
Nov 25 05:36:52 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 25 05:36:52 localhost kernel: Running certificate verification ECDSA selftest
Nov 25 05:36:52 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 25 05:36:52 localhost kernel: clk: Disabling unused clocks
Nov 25 05:36:52 localhost kernel: Freeing unused decrypted memory: 2028K
Nov 25 05:36:52 localhost kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 25 05:36:52 localhost kernel: Write protecting the kernel read-only data: 30720k
Nov 25 05:36:52 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 25 05:36:52 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 25 05:36:52 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 25 05:36:52 localhost kernel: Run /init as init process
Nov 25 05:36:52 localhost kernel:   with arguments:
Nov 25 05:36:52 localhost kernel:     /init
Nov 25 05:36:52 localhost kernel:   with environment:
Nov 25 05:36:52 localhost kernel:     HOME=/
Nov 25 05:36:52 localhost kernel:     TERM=linux
Nov 25 05:36:52 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64
Nov 25 05:36:52 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 25 05:36:52 localhost systemd[1]: Detected virtualization kvm.
Nov 25 05:36:52 localhost systemd[1]: Detected architecture x86-64.
Nov 25 05:36:52 localhost systemd[1]: Running in initrd.
Nov 25 05:36:52 localhost systemd[1]: No hostname configured, using default hostname.
Nov 25 05:36:52 localhost systemd[1]: Hostname set to <localhost>.
Nov 25 05:36:52 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 25 05:36:52 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 25 05:36:52 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 25 05:36:52 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 25 05:36:52 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 25 05:36:52 localhost systemd[1]: Reached target Local File Systems.
Nov 25 05:36:52 localhost systemd[1]: Reached target Path Units.
Nov 25 05:36:52 localhost systemd[1]: Reached target Slice Units.
Nov 25 05:36:52 localhost systemd[1]: Reached target Swaps.
Nov 25 05:36:52 localhost systemd[1]: Reached target Timer Units.
Nov 25 05:36:52 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 25 05:36:52 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 25 05:36:52 localhost systemd[1]: Listening on Journal Socket.
Nov 25 05:36:52 localhost systemd[1]: Listening on udev Control Socket.
Nov 25 05:36:52 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 25 05:36:52 localhost systemd[1]: Reached target Socket Units.
Nov 25 05:36:52 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 25 05:36:52 localhost systemd[1]: Starting Journal Service...
Nov 25 05:36:52 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 25 05:36:52 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 25 05:36:52 localhost systemd[1]: Starting Create System Users...
Nov 25 05:36:52 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 25 05:36:52 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 25 05:36:52 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 25 05:36:52 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 25 05:36:52 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:02.0:00.0:01.0-1
Nov 25 05:36:52 localhost systemd[1]: Starting Setup Virtual Console...
Nov 25 05:36:52 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 25 05:36:52 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 25 05:36:52 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0
Nov 25 05:36:52 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 25 05:36:52 localhost systemd[1]: Finished Create System Users.
Nov 25 05:36:52 localhost systemd-journald[279]: Journal started
Nov 25 05:36:52 localhost systemd-journald[279]: Runtime Journal (/run/log/journal/372a8332b89e4c47aaaf012dba1b43d0) is 8.0M, max 153.6M, 145.6M free.
Nov 25 05:36:52 localhost systemd-sysusers[283]: Creating group 'users' with GID 100.
Nov 25 05:36:52 localhost systemd-sysusers[283]: Creating group 'dbus' with GID 81.
Nov 25 05:36:52 localhost systemd-sysusers[283]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 25 05:36:52 localhost systemd[1]: Started Journal Service.
Nov 25 05:36:52 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 25 05:36:52 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 25 05:36:52 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 25 05:36:53 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 25 05:36:53 localhost systemd[1]: Finished Setup Virtual Console.
Nov 25 05:36:53 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 25 05:36:53 localhost systemd[1]: Starting dracut cmdline hook...
Nov 25 05:36:53 localhost dracut-cmdline[297]: dracut-9 dracut-057-102.git20250818.el9
Nov 25 05:36:53 localhost dracut-cmdline[297]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 25 05:36:53 localhost systemd[1]: Finished dracut cmdline hook.
Nov 25 05:36:53 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 25 05:36:53 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 25 05:36:53 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 25 05:36:53 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 25 05:36:53 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 25 05:36:53 localhost kernel: RPC: Registered udp transport module.
Nov 25 05:36:53 localhost kernel: RPC: Registered tcp transport module.
Nov 25 05:36:53 localhost kernel: RPC: Registered tcp-with-tls transport module.
Nov 25 05:36:53 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 25 05:36:53 localhost rpc.statd[414]: Version 2.5.4 starting
Nov 25 05:36:53 localhost rpc.statd[414]: Initializing NSM state
Nov 25 05:36:53 localhost rpc.idmapd[419]: Setting log level to 0
Nov 25 05:36:53 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 25 05:36:53 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 25 05:36:53 localhost systemd-udevd[432]: Using default interface naming scheme 'rhel-9.0'.
Nov 25 05:36:53 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 25 05:36:53 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 25 05:36:53 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 25 05:36:53 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 25 05:36:53 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 25 05:36:53 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 25 05:36:53 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 25 05:36:53 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 25 05:36:53 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 25 05:36:53 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 25 05:36:53 localhost systemd[1]: Reached target Network.
Nov 25 05:36:53 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 25 05:36:53 localhost systemd[1]: Starting dracut initqueue hook...
Nov 25 05:36:53 localhost kernel: virtio_blk virtio2: 4/0/0 default/read/poll queues
Nov 25 05:36:53 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 25 05:36:53 localhost kernel:  vda: vda1
Nov 25 05:36:53 localhost systemd-udevd[450]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 05:36:53 localhost kernel: libata version 3.00 loaded.
Nov 25 05:36:53 localhost kernel: ahci 0000:00:1f.2: version 3.0
Nov 25 05:36:53 localhost kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Nov 25 05:36:53 localhost kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Nov 25 05:36:53 localhost kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Nov 25 05:36:53 localhost kernel: ahci 0000:00:1f.2: flags: 64bit ncq only 
Nov 25 05:36:53 localhost kernel: scsi host0: ahci
Nov 25 05:36:53 localhost kernel: scsi host1: ahci
Nov 25 05:36:53 localhost systemd[1]: Found device /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 25 05:36:53 localhost systemd[1]: Reached target Initrd Root Device.
Nov 25 05:36:53 localhost kernel: scsi host2: ahci
Nov 25 05:36:53 localhost kernel: scsi host3: ahci
Nov 25 05:36:53 localhost kernel: scsi host4: ahci
Nov 25 05:36:53 localhost kernel: scsi host5: ahci
Nov 25 05:36:53 localhost kernel: ata1: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22100 irq 49 lpm-pol 0
Nov 25 05:36:53 localhost kernel: ata2: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22180 irq 49 lpm-pol 0
Nov 25 05:36:53 localhost kernel: ata3: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22200 irq 49 lpm-pol 0
Nov 25 05:36:53 localhost kernel: ata4: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22280 irq 49 lpm-pol 0
Nov 25 05:36:53 localhost kernel: ata5: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22300 irq 49 lpm-pol 0
Nov 25 05:36:53 localhost kernel: ata6: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22380 irq 49 lpm-pol 0
Nov 25 05:36:53 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 25 05:36:53 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 25 05:36:53 localhost systemd[1]: Reached target System Initialization.
Nov 25 05:36:53 localhost systemd[1]: Reached target Basic System.
Nov 25 05:36:53 localhost kernel: ata2: SATA link down (SStatus 0 SControl 300)
Nov 25 05:36:53 localhost kernel: ata3: SATA link down (SStatus 0 SControl 300)
Nov 25 05:36:53 localhost kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Nov 25 05:36:53 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 25 05:36:53 localhost kernel: ata1.00: applying bridge limits
Nov 25 05:36:53 localhost kernel: ata1.00: configured for UDMA/100
Nov 25 05:36:53 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 25 05:36:53 localhost kernel: ata4: SATA link down (SStatus 0 SControl 300)
Nov 25 05:36:53 localhost kernel: ata5: SATA link down (SStatus 0 SControl 300)
Nov 25 05:36:53 localhost kernel: ata6: SATA link down (SStatus 0 SControl 300)
Nov 25 05:36:53 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 25 05:36:53 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 25 05:36:53 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 25 05:36:53 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 25 05:36:54 localhost systemd[1]: Finished dracut initqueue hook.
Nov 25 05:36:54 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 25 05:36:54 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 25 05:36:54 localhost systemd[1]: Reached target Remote File Systems.
Nov 25 05:36:54 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 25 05:36:54 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 25 05:36:54 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709...
Nov 25 05:36:54 localhost systemd-fsck[524]: /usr/sbin/fsck.xfs: XFS file system.
Nov 25 05:36:54 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 25 05:36:54 localhost systemd[1]: Mounting /sysroot...
Nov 25 05:36:54 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 25 05:36:54 localhost kernel: XFS (vda1): Mounting V5 Filesystem 47e3724e-7a1b-439a-9543-b98c9a290709
Nov 25 05:36:54 localhost kernel: XFS (vda1): Ending clean mount
Nov 25 05:36:54 localhost systemd[1]: Mounted /sysroot.
Nov 25 05:36:54 localhost systemd[1]: Reached target Initrd Root File System.
Nov 25 05:36:54 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 25 05:36:54 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 25 05:36:54 localhost systemd[1]: Reached target Initrd File Systems.
Nov 25 05:36:54 localhost systemd[1]: Reached target Initrd Default Target.
Nov 25 05:36:54 localhost systemd[1]: Starting dracut mount hook...
Nov 25 05:36:54 localhost systemd[1]: Finished dracut mount hook.
Nov 25 05:36:54 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 25 05:36:54 localhost rpc.idmapd[419]: exiting on signal 15
Nov 25 05:36:54 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 25 05:36:54 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 25 05:36:54 localhost systemd[1]: Stopped target Network.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Timer Units.
Nov 25 05:36:54 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 25 05:36:54 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Basic System.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Path Units.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Remote File Systems.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Slice Units.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Socket Units.
Nov 25 05:36:54 localhost systemd[1]: Stopped target System Initialization.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Local File Systems.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Swaps.
Nov 25 05:36:54 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped dracut mount hook.
Nov 25 05:36:54 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 25 05:36:54 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 25 05:36:54 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 25 05:36:54 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 25 05:36:54 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 25 05:36:54 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 25 05:36:54 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 25 05:36:54 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 25 05:36:54 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 25 05:36:54 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 25 05:36:54 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 25 05:36:54 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 25 05:36:54 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Closed udev Control Socket.
Nov 25 05:36:54 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Closed udev Kernel Socket.
Nov 25 05:36:54 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 25 05:36:54 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 25 05:36:54 localhost systemd[1]: Starting Cleanup udev Database...
Nov 25 05:36:54 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 25 05:36:54 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 25 05:36:54 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Stopped Create System Users.
Nov 25 05:36:54 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 25 05:36:54 localhost systemd[1]: Finished Cleanup udev Database.
Nov 25 05:36:54 localhost systemd[1]: Reached target Switch Root.
Nov 25 05:36:54 localhost systemd[1]: Starting Switch Root...
Nov 25 05:36:54 localhost systemd[1]: Switching root.
Nov 25 05:36:54 localhost systemd-journald[279]: Received SIGTERM from PID 1 (systemd).
Nov 25 05:36:54 localhost systemd-journald[279]: Journal stopped
Nov 25 05:36:55 localhost kernel: audit: type=1404 audit(1764049014.774:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 25 05:36:55 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 05:36:55 localhost kernel: SELinux:  policy capability open_perms=1
Nov 25 05:36:55 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 05:36:55 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 25 05:36:55 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 05:36:55 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 05:36:55 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 05:36:55 localhost kernel: audit: type=1403 audit(1764049014.884:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 25 05:36:55 localhost systemd[1]: Successfully loaded SELinux policy in 111.929ms.
Nov 25 05:36:55 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 22.541ms.
Nov 25 05:36:55 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 25 05:36:55 localhost systemd[1]: Detected virtualization kvm.
Nov 25 05:36:55 localhost systemd[1]: Detected architecture x86-64.
Nov 25 05:36:55 localhost systemd-rc-local-generator[608]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 05:36:55 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 25 05:36:55 localhost systemd[1]: Stopped Switch Root.
Nov 25 05:36:55 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 25 05:36:55 localhost systemd[1]: Created slice Slice /system/getty.
Nov 25 05:36:55 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 25 05:36:55 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 25 05:36:55 localhost systemd[1]: Created slice User and Session Slice.
Nov 25 05:36:55 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 25 05:36:55 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 25 05:36:55 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 25 05:36:55 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 25 05:36:55 localhost systemd[1]: Stopped target Switch Root.
Nov 25 05:36:55 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 25 05:36:55 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 25 05:36:55 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 25 05:36:55 localhost systemd[1]: Reached target Path Units.
Nov 25 05:36:55 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 25 05:36:55 localhost systemd[1]: Reached target Slice Units.
Nov 25 05:36:55 localhost systemd[1]: Reached target Swaps.
Nov 25 05:36:55 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 25 05:36:55 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 25 05:36:55 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 25 05:36:55 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 25 05:36:55 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 25 05:36:55 localhost systemd[1]: Listening on udev Control Socket.
Nov 25 05:36:55 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 25 05:36:55 localhost systemd[1]: Mounting Huge Pages File System...
Nov 25 05:36:55 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 25 05:36:55 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 25 05:36:55 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 25 05:36:55 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 25 05:36:55 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 25 05:36:55 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 25 05:36:55 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 25 05:36:55 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Nov 25 05:36:55 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 25 05:36:55 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 25 05:36:55 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 25 05:36:55 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 25 05:36:55 localhost systemd[1]: Stopped Journal Service.
Nov 25 05:36:55 localhost systemd[1]: Starting Journal Service...
Nov 25 05:36:55 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 25 05:36:55 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 25 05:36:55 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 25 05:36:55 localhost kernel: fuse: init (API version 7.37)
Nov 25 05:36:55 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 25 05:36:55 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 25 05:36:55 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 25 05:36:55 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 25 05:36:55 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 25 05:36:55 localhost systemd[1]: Mounted Huge Pages File System.
Nov 25 05:36:55 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 25 05:36:55 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 25 05:36:55 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 25 05:36:55 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 25 05:36:55 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 25 05:36:55 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 25 05:36:55 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 25 05:36:55 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 25 05:36:55 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 25 05:36:55 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 25 05:36:55 localhost kernel: ACPI: bus type drm_connector registered
Nov 25 05:36:55 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 25 05:36:55 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 25 05:36:55 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 25 05:36:55 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 25 05:36:55 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 25 05:36:55 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 25 05:36:55 localhost systemd-journald[649]: Journal started
Nov 25 05:36:55 localhost systemd-journald[649]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 25 05:36:55 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 25 05:36:55 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 25 05:36:55 localhost systemd[1]: Started Journal Service.
Nov 25 05:36:55 localhost systemd[1]: Mounting FUSE Control File System...
Nov 25 05:36:55 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 25 05:36:55 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 25 05:36:55 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 25 05:36:55 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 25 05:36:55 localhost systemd-journald[649]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 25 05:36:55 localhost systemd-journald[649]: Received client request to flush runtime journal.
Nov 25 05:36:55 localhost systemd[1]: Starting Load/Save OS Random Seed...
Nov 25 05:36:55 localhost systemd[1]: Starting Create System Users...
Nov 25 05:36:55 localhost systemd[1]: Mounted FUSE Control File System.
Nov 25 05:36:55 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 25 05:36:55 localhost systemd[1]: Finished Load/Save OS Random Seed.
Nov 25 05:36:55 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 25 05:36:55 localhost systemd[1]: Finished Create System Users.
Nov 25 05:36:55 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 25 05:36:55 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 25 05:36:55 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 25 05:36:55 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 25 05:36:55 localhost systemd[1]: Reached target Local File Systems.
Nov 25 05:36:55 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 25 05:36:55 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 25 05:36:55 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 25 05:36:55 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 25 05:36:55 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 25 05:36:55 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 25 05:36:55 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 25 05:36:55 localhost bootctl[666]: Couldn't find EFI system partition, skipping.
Nov 25 05:36:55 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 25 05:36:55 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 25 05:36:55 localhost systemd[1]: Starting Security Auditing Service...
Nov 25 05:36:55 localhost systemd[1]: Starting RPC Bind...
Nov 25 05:36:55 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 25 05:36:55 localhost auditd[672]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 25 05:36:55 localhost auditd[672]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 25 05:36:55 localhost systemd[1]: Started RPC Bind.
Nov 25 05:36:55 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 25 05:36:55 localhost augenrules[677]: /sbin/augenrules: No change
Nov 25 05:36:55 localhost augenrules[692]: No rules
Nov 25 05:36:55 localhost augenrules[692]: enabled 1
Nov 25 05:36:55 localhost augenrules[692]: failure 1
Nov 25 05:36:55 localhost augenrules[692]: pid 672
Nov 25 05:36:55 localhost augenrules[692]: rate_limit 0
Nov 25 05:36:55 localhost augenrules[692]: backlog_limit 8192
Nov 25 05:36:55 localhost augenrules[692]: lost 0
Nov 25 05:36:55 localhost augenrules[692]: backlog 0
Nov 25 05:36:55 localhost augenrules[692]: backlog_wait_time 60000
Nov 25 05:36:55 localhost augenrules[692]: backlog_wait_time_actual 0
Nov 25 05:36:55 localhost systemd[1]: Started Security Auditing Service.
Nov 25 05:36:55 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 25 05:36:55 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 25 05:36:55 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 25 05:36:55 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 25 05:36:55 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 25 05:36:55 localhost systemd[1]: Starting Update is Completed...
Nov 25 05:36:55 localhost systemd[1]: Finished Update is Completed.
Nov 25 05:36:55 localhost systemd-udevd[700]: Using default interface naming scheme 'rhel-9.0'.
Nov 25 05:36:55 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 25 05:36:55 localhost systemd[1]: Reached target System Initialization.
Nov 25 05:36:55 localhost systemd[1]: Started dnf makecache --timer.
Nov 25 05:36:55 localhost systemd[1]: Started Daily rotation of log files.
Nov 25 05:36:55 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 25 05:36:55 localhost systemd[1]: Reached target Timer Units.
Nov 25 05:36:55 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 25 05:36:55 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 25 05:36:55 localhost systemd[1]: Reached target Socket Units.
Nov 25 05:36:55 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 25 05:36:55 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 25 05:36:55 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 25 05:36:55 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 25 05:36:55 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 25 05:36:55 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 25 05:36:55 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 25 05:36:55 localhost systemd[1]: Reached target Basic System.
Nov 25 05:36:55 localhost dbus-broker-lau[713]: Ready
Nov 25 05:36:55 localhost systemd[1]: Starting NTP client/server...
Nov 25 05:36:55 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 25 05:36:55 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 25 05:36:55 localhost systemd-udevd[709]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 05:36:55 localhost systemd[1]: Starting IPv4 firewall with iptables...
Nov 25 05:36:55 localhost systemd[1]: Started irqbalance daemon.
Nov 25 05:36:55 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 25 05:36:55 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 05:36:55 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 05:36:55 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 05:36:55 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 25 05:36:55 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 25 05:36:55 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 25 05:36:55 localhost systemd[1]: Starting User Login Management...
Nov 25 05:36:55 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 25 05:36:55 localhost chronyd[753]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 25 05:36:55 localhost chronyd[753]: Loaded 0 symmetric keys
Nov 25 05:36:55 localhost chronyd[753]: Using right/UTC timezone to obtain leap second data
Nov 25 05:36:55 localhost chronyd[753]: Loaded seccomp filter (level 2)
Nov 25 05:36:55 localhost systemd[1]: Started NTP client/server.
Nov 25 05:36:55 localhost systemd-logind[744]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 25 05:36:55 localhost systemd-logind[744]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 25 05:36:55 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 25 05:36:55 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 25 05:36:55 localhost systemd-logind[744]: New seat seat0.
Nov 25 05:36:55 localhost systemd[1]: Started User Login Management.
Nov 25 05:36:55 localhost kernel: lpc_ich 0000:00:1f.0: I/O space for GPIO uninitialized
Nov 25 05:36:56 localhost kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Nov 25 05:36:56 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 25 05:36:56 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 25 05:36:56 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 25 05:36:56 localhost kernel: iTCO_vendor_support: vendor-support=0
Nov 25 05:36:56 localhost kernel: iTCO_wdt iTCO_wdt.1.auto: Found a ICH9 TCO device (Version=2, TCOBASE=0x0660)
Nov 25 05:36:56 localhost iptables.init[738]: iptables: Applying firewall rules: [  OK  ]
Nov 25 05:36:56 localhost systemd[1]: Finished IPv4 firewall with iptables.
Nov 25 05:36:56 localhost kernel: iTCO_wdt iTCO_wdt.1.auto: initialized. heartbeat=30 sec (nowayout=0)
Nov 25 05:36:56 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Nov 25 05:36:56 localhost kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Nov 25 05:36:56 localhost kernel: Console: switching to colour dummy device 80x25
Nov 25 05:36:56 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 25 05:36:56 localhost kernel: [drm] features: -context_init
Nov 25 05:36:56 localhost kernel: [drm] number of scanouts: 1
Nov 25 05:36:56 localhost kernel: [drm] number of cap sets: 0
Nov 25 05:36:56 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Nov 25 05:36:56 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 25 05:36:56 localhost kernel: Console: switching to colour frame buffer device 160x50
Nov 25 05:36:56 localhost kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 25 05:36:56 localhost kernel: kvm_amd: TSC scaling supported
Nov 25 05:36:56 localhost kernel: kvm_amd: Nested Virtualization enabled
Nov 25 05:36:56 localhost kernel: kvm_amd: Nested Paging enabled
Nov 25 05:36:56 localhost kernel: kvm_amd: LBR virtualization supported
Nov 25 05:36:56 localhost kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Nov 25 05:36:56 localhost kernel: kvm_amd: Virtual GIF supported
Nov 25 05:36:56 localhost cloud-init[793]: Cloud-init v. 24.4-7.el9 running 'init-local' at Tue, 25 Nov 2025 05:36:56 +0000. Up 4.97 seconds.
Nov 25 05:36:56 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 25 05:36:56 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 25 05:36:56 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpwsfgs1x1.mount: Deactivated successfully.
Nov 25 05:36:56 localhost systemd[1]: Starting Hostname Service...
Nov 25 05:36:56 localhost systemd[1]: Started Hostname Service.
Nov 25 05:36:56 np0005534386 systemd-hostnamed[807]: Hostname set to <np0005534386> (static)
Nov 25 05:36:56 np0005534386 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 25 05:36:56 np0005534386 systemd[1]: Reached target Preparation for Network.
Nov 25 05:36:56 np0005534386 systemd[1]: Starting Network Manager...
Nov 25 05:36:56 np0005534386 NetworkManager[811]: <info>  [1764049016.9834] NetworkManager (version 1.54.1-1.el9) is starting... (boot:724c5288-952e-4df2-81c9-c4395c2e16fd)
Nov 25 05:36:56 np0005534386 NetworkManager[811]: <info>  [1764049016.9838] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Nov 25 05:36:56 np0005534386 NetworkManager[811]: <info>  [1764049016.9924] manager[0x5637ea0c1080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 25 05:36:56 np0005534386 NetworkManager[811]: <info>  [1764049016.9952] hostname: hostname: using hostnamed
Nov 25 05:36:56 np0005534386 NetworkManager[811]: <info>  [1764049016.9953] hostname: static hostname changed from (none) to "np0005534386"
Nov 25 05:36:56 np0005534386 NetworkManager[811]: <info>  [1764049016.9955] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0093] manager[0x5637ea0c1080]: rfkill: Wi-Fi hardware radio set enabled
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0093] manager[0x5637ea0c1080]: rfkill: WWAN hardware radio set enabled
Nov 25 05:36:57 np0005534386 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0138] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0138] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0139] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0139] manager: Networking is enabled by state file
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0140] settings: Loaded settings plugin: keyfile (internal)
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0155] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0175] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0187] dhcp: init: Using DHCP client 'internal'
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0189] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0199] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0207] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0212] device (lo): Activation: starting connection 'lo' (38cce539-3d4c-4266-b4bb-4a3c7b88c026)
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0220] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0222] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0246] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0248] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0250] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0251] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0254] device (eth0): carrier: link connected
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0257] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0262] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0266] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0270] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0271] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0272] manager: NetworkManager state is now CONNECTING
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0274] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 05:36:57 np0005534386 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0279] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0287] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0291] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Nov 25 05:36:57 np0005534386 systemd[1]: Started Network Manager.
Nov 25 05:36:57 np0005534386 systemd[1]: Reached target Network.
Nov 25 05:36:57 np0005534386 systemd[1]: Starting Network Manager Wait Online...
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0356] dhcp4 (eth0): state changed new lease, address=192.168.26.115
Nov 25 05:36:57 np0005534386 systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0379] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 25 05:36:57 np0005534386 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0445] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0446] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 25 05:36:57 np0005534386 NetworkManager[811]: <info>  [1764049017.0450] device (lo): Activation: successful, device activated.
Nov 25 05:36:57 np0005534386 systemd[1]: Started GSSAPI Proxy Daemon.
Nov 25 05:36:57 np0005534386 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 25 05:36:57 np0005534386 systemd[1]: Reached target NFS client services.
Nov 25 05:36:57 np0005534386 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 25 05:36:57 np0005534386 systemd[1]: Reached target Remote File Systems.
Nov 25 05:36:57 np0005534386 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 25 05:36:58 np0005534386 NetworkManager[811]: <info>  [1764049018.9880] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 05:37:00 np0005534386 NetworkManager[811]: <info>  [1764049020.0490] dhcp6 (eth0): state changed new lease, address=2001:db8::331
Nov 25 05:37:01 np0005534386 NetworkManager[811]: <info>  [1764049021.3565] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 05:37:01 np0005534386 NetworkManager[811]: <info>  [1764049021.3589] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 05:37:01 np0005534386 NetworkManager[811]: <info>  [1764049021.3590] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 05:37:01 np0005534386 NetworkManager[811]: <info>  [1764049021.3592] manager: NetworkManager state is now CONNECTED_SITE
Nov 25 05:37:01 np0005534386 NetworkManager[811]: <info>  [1764049021.3595] device (eth0): Activation: successful, device activated.
Nov 25 05:37:01 np0005534386 NetworkManager[811]: <info>  [1764049021.3599] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 25 05:37:01 np0005534386 NetworkManager[811]: <info>  [1764049021.3601] manager: startup complete
Nov 25 05:37:01 np0005534386 systemd[1]: Finished Network Manager Wait Online.
Nov 25 05:37:01 np0005534386 systemd[1]: Starting Cloud-init: Network Stage...
Nov 25 05:37:01 np0005534386 cloud-init[877]: Cloud-init v. 24.4-7.el9 running 'init' at Tue, 25 Nov 2025 05:37:01 +0000. Up 10.04 seconds.
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |  eth0  | True |        192.168.26.115        | 255.255.255.0 | global | fa:16:3e:99:5d:0b |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |  eth0  | True |      2001:db8::331/128       |       .       | global | fa:16:3e:99:5d:0b |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |  eth0  | True | fe80::f816:3eff:fe99:5d0b/64 |       .       |  link  | fa:16:3e:99:5d:0b |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: | Route |   Destination   |   Gateway    |     Genmask     | Interface | Flags |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |   0   |     0.0.0.0     | 192.168.26.1 |     0.0.0.0     |    eth0   |   UG  |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |   1   | 169.254.169.254 | 192.168.26.2 | 255.255.255.255 |    eth0   |  UGH  |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |   2   |   192.168.26.0  |   0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: ++++++++++++++++++++++Route IPv6 info++++++++++++++++++++++
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: +-------+---------------+-------------+-----------+-------+
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: | Route |  Destination  |   Gateway   | Interface | Flags |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: +-------+---------------+-------------+-----------+-------+
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |   1   |  2001:db8::1  |      ::     |    eth0   |   U   |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |   2   | 2001:db8::331 |      ::     |    eth0   |   U   |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |   3   |   fe80::/64   |      ::     |    eth0   |   U   |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |   4   |      ::/0     | 2001:db8::1 |    eth0   |   UG  |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |   6   |     local     |      ::     |    eth0   |   U   |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |   7   |     local     |      ::     |    eth0   |   U   |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: |   8   |   multicast   |      ::     |    eth0   |   U   |
Nov 25 05:37:01 np0005534386 cloud-init[877]: ci-info: +-------+---------------+-------------+-----------+-------+
Nov 25 05:37:01 np0005534386 chronyd[753]: Selected source 73.185.182.209 (2.centos.pool.ntp.org)
Nov 25 05:37:01 np0005534386 chronyd[753]: System clock TAI offset set to 37 seconds
Nov 25 05:37:02 np0005534386 useradd[944]: new group: name=cloud-user, GID=1001
Nov 25 05:37:02 np0005534386 useradd[944]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 25 05:37:02 np0005534386 useradd[944]: add 'cloud-user' to group 'adm'
Nov 25 05:37:02 np0005534386 useradd[944]: add 'cloud-user' to group 'systemd-journal'
Nov 25 05:37:02 np0005534386 useradd[944]: add 'cloud-user' to shadow group 'adm'
Nov 25 05:37:02 np0005534386 useradd[944]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 25 05:37:02 np0005534386 cloud-init[877]: Generating public/private rsa key pair.
Nov 25 05:37:02 np0005534386 cloud-init[877]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 25 05:37:02 np0005534386 cloud-init[877]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 25 05:37:02 np0005534386 cloud-init[877]: The key fingerprint is:
Nov 25 05:37:02 np0005534386 cloud-init[877]: SHA256:H9HNTICTsasWC4yQfpDg4ejUtzsZhJI/404lx4m5i8E root@np0005534386
Nov 25 05:37:02 np0005534386 cloud-init[877]: The key's randomart image is:
Nov 25 05:37:02 np0005534386 cloud-init[877]: +---[RSA 3072]----+
Nov 25 05:37:02 np0005534386 cloud-init[877]: |..        .+...  |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |+.+o.     +o =   |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |.*=o o    o.. +  |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |o.oo*oo    o     |
Nov 25 05:37:02 np0005534386 cloud-init[877]: | ..BoBo S o      |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |. ..B +. = .     |
Nov 25 05:37:02 np0005534386 cloud-init[877]: | E + +  + .      |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |  = . ..         |
Nov 25 05:37:02 np0005534386 cloud-init[877]: | . o             |
Nov 25 05:37:02 np0005534386 cloud-init[877]: +----[SHA256]-----+
Nov 25 05:37:02 np0005534386 cloud-init[877]: Generating public/private ecdsa key pair.
Nov 25 05:37:02 np0005534386 cloud-init[877]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 25 05:37:02 np0005534386 cloud-init[877]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 25 05:37:02 np0005534386 cloud-init[877]: The key fingerprint is:
Nov 25 05:37:02 np0005534386 cloud-init[877]: SHA256:BC/9BMaqgcwQs5Y9t1FERc8/UGiGHCfPz/5Lp+RN4Nk root@np0005534386
Nov 25 05:37:02 np0005534386 cloud-init[877]: The key's randomart image is:
Nov 25 05:37:02 np0005534386 cloud-init[877]: +---[ECDSA 256]---+
Nov 25 05:37:02 np0005534386 cloud-init[877]: |o.    +=*=o...   |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |.oo   .=.+B+.    |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |.* + o..+ +*     |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |. + + +o o  =    |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |     +  S .  =.  |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |    .       ...+ |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |             .+.E|
Nov 25 05:37:02 np0005534386 cloud-init[877]: |             oo+.|
Nov 25 05:37:02 np0005534386 cloud-init[877]: |              ooo|
Nov 25 05:37:02 np0005534386 cloud-init[877]: +----[SHA256]-----+
Nov 25 05:37:02 np0005534386 cloud-init[877]: Generating public/private ed25519 key pair.
Nov 25 05:37:02 np0005534386 cloud-init[877]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 25 05:37:02 np0005534386 cloud-init[877]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 25 05:37:02 np0005534386 cloud-init[877]: The key fingerprint is:
Nov 25 05:37:02 np0005534386 cloud-init[877]: SHA256:1xsPPJtnQfUFPkAIaDbBMrIMsXbgqAg8BtEREHhLwH0 root@np0005534386
Nov 25 05:37:02 np0005534386 cloud-init[877]: The key's randomart image is:
Nov 25 05:37:02 np0005534386 cloud-init[877]: +--[ED25519 256]--+
Nov 25 05:37:02 np0005534386 cloud-init[877]: |XO+o ..o.. oo ..o|
Nov 25 05:37:02 np0005534386 cloud-init[877]: |Bo*.oE*   .  o .o|
Nov 25 05:37:02 np0005534386 cloud-init[877]: |+@ =.= .      + .|
Nov 25 05:37:02 np0005534386 cloud-init[877]: |* *        o . . |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |o       S . * .  |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |         .   O . |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |            + +  |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |             o   |
Nov 25 05:37:02 np0005534386 cloud-init[877]: |                 |
Nov 25 05:37:02 np0005534386 cloud-init[877]: +----[SHA256]-----+
Nov 25 05:37:02 np0005534386 systemd[1]: Finished Cloud-init: Network Stage.
Nov 25 05:37:02 np0005534386 systemd[1]: Reached target Cloud-config availability.
Nov 25 05:37:02 np0005534386 systemd[1]: Reached target Network is Online.
Nov 25 05:37:02 np0005534386 systemd[1]: Starting Cloud-init: Config Stage...
Nov 25 05:37:02 np0005534386 systemd[1]: Starting Crash recovery kernel arming...
Nov 25 05:37:02 np0005534386 systemd[1]: Starting Notify NFS peers of a restart...
Nov 25 05:37:02 np0005534386 systemd[1]: Starting System Logging Service...
Nov 25 05:37:02 np0005534386 sm-notify[960]: Version 2.5.4 starting
Nov 25 05:37:02 np0005534386 systemd[1]: Starting OpenSSH server daemon...
Nov 25 05:37:02 np0005534386 systemd[1]: Starting Permit User Sessions...
Nov 25 05:37:02 np0005534386 sshd[962]: Server listening on 0.0.0.0 port 22.
Nov 25 05:37:02 np0005534386 sshd[962]: Server listening on :: port 22.
Nov 25 05:37:02 np0005534386 systemd[1]: Started OpenSSH server daemon.
Nov 25 05:37:02 np0005534386 rsyslogd[961]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="961" x-info="https://www.rsyslog.com"] start
Nov 25 05:37:02 np0005534386 rsyslogd[961]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 25 05:37:02 np0005534386 systemd[1]: Started System Logging Service.
Nov 25 05:37:02 np0005534386 systemd[1]: Started Notify NFS peers of a restart.
Nov 25 05:37:02 np0005534386 systemd[1]: Finished Permit User Sessions.
Nov 25 05:37:02 np0005534386 systemd[1]: Started Command Scheduler.
Nov 25 05:37:02 np0005534386 systemd[1]: Started Getty on tty1.
Nov 25 05:37:02 np0005534386 systemd[1]: Started Serial Getty on ttyS0.
Nov 25 05:37:02 np0005534386 crond[970]: (CRON) STARTUP (1.5.7)
Nov 25 05:37:02 np0005534386 crond[970]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 25 05:37:02 np0005534386 crond[970]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 11% if used.)
Nov 25 05:37:02 np0005534386 crond[970]: (CRON) INFO (running with inotify support)
Nov 25 05:37:02 np0005534386 systemd[1]: Reached target Login Prompts.
Nov 25 05:37:02 np0005534386 sshd-session[964]: Connection reset by 192.168.26.11 port 47720 [preauth]
Nov 25 05:37:02 np0005534386 systemd[1]: Reached target Multi-User System.
Nov 25 05:37:02 np0005534386 systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 25 05:37:02 np0005534386 sshd-session[972]: Unable to negotiate with 192.168.26.11 port 47722: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 25 05:37:02 np0005534386 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 25 05:37:02 np0005534386 systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 25 05:37:02 np0005534386 sshd-session[984]: Unable to negotiate with 192.168.26.11 port 47738: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 25 05:37:02 np0005534386 sshd-session[988]: Unable to negotiate with 192.168.26.11 port 47742: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 25 05:37:02 np0005534386 sshd-session[1000]: Connection reset by 192.168.26.11 port 47756 [preauth]
Nov 25 05:37:02 np0005534386 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 05:37:02 np0005534386 sshd-session[1011]: Unable to negotiate with 192.168.26.11 port 47766: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Nov 25 05:37:02 np0005534386 sshd-session[1015]: Unable to negotiate with 192.168.26.11 port 47774: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 25 05:37:02 np0005534386 sshd-session[977]: Connection closed by 192.168.26.11 port 47736 [preauth]
Nov 25 05:37:02 np0005534386 kdumpctl[979]: kdump: No kdump initial ramdisk found.
Nov 25 05:37:02 np0005534386 kdumpctl[979]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 25 05:37:02 np0005534386 sshd-session[993]: Connection closed by 192.168.26.11 port 47744 [preauth]
Nov 25 05:37:02 np0005534386 cloud-init[1107]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Tue, 25 Nov 2025 05:37:02 +0000. Up 11.38 seconds.
Nov 25 05:37:03 np0005534386 systemd[1]: Finished Cloud-init: Config Stage.
Nov 25 05:37:03 np0005534386 systemd[1]: Starting Cloud-init: Final Stage...
Nov 25 05:37:03 np0005534386 dracut[1239]: dracut-057-102.git20250818.el9
Nov 25 05:37:03 np0005534386 dracut[1241]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 25 05:37:03 np0005534386 cloud-init[1267]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Tue, 25 Nov 2025 05:37:03 +0000. Up 11.73 seconds.
Nov 25 05:37:03 np0005534386 cloud-init[1304]: #############################################################
Nov 25 05:37:03 np0005534386 cloud-init[1308]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 25 05:37:03 np0005534386 cloud-init[1314]: 256 SHA256:BC/9BMaqgcwQs5Y9t1FERc8/UGiGHCfPz/5Lp+RN4Nk root@np0005534386 (ECDSA)
Nov 25 05:37:03 np0005534386 cloud-init[1316]: 256 SHA256:1xsPPJtnQfUFPkAIaDbBMrIMsXbgqAg8BtEREHhLwH0 root@np0005534386 (ED25519)
Nov 25 05:37:03 np0005534386 cloud-init[1318]: 3072 SHA256:H9HNTICTsasWC4yQfpDg4ejUtzsZhJI/404lx4m5i8E root@np0005534386 (RSA)
Nov 25 05:37:03 np0005534386 cloud-init[1319]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 25 05:37:03 np0005534386 cloud-init[1320]: #############################################################
Nov 25 05:37:03 np0005534386 cloud-init[1267]: Cloud-init v. 24.4-7.el9 finished at Tue, 25 Nov 2025 05:37:03 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.86 seconds
Nov 25 05:37:03 np0005534386 systemd[1]: Finished Cloud-init: Final Stage.
Nov 25 05:37:03 np0005534386 systemd[1]: Reached target Cloud-init target.
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 25 05:37:03 np0005534386 dracut[1241]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 25 05:37:03 np0005534386 dracut[1241]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: Module 'resume' will not be installed, because it's in the list to be omitted!
Nov 25 05:37:03 np0005534386 dracut[1241]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 25 05:37:03 np0005534386 dracut[1241]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: memstrack is not available
Nov 25 05:37:04 np0005534386 dracut[1241]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 25 05:37:04 np0005534386 dracut[1241]: memstrack is not available
Nov 25 05:37:04 np0005534386 dracut[1241]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 25 05:37:04 np0005534386 dracut[1241]: *** Including module: systemd ***
Nov 25 05:37:04 np0005534386 dracut[1241]: *** Including module: fips ***
Nov 25 05:37:04 np0005534386 dracut[1241]: *** Including module: systemd-initrd ***
Nov 25 05:37:04 np0005534386 dracut[1241]: *** Including module: i18n ***
Nov 25 05:37:04 np0005534386 dracut[1241]: *** Including module: drm ***
Nov 25 05:37:05 np0005534386 dracut[1241]: *** Including module: prefixdevname ***
Nov 25 05:37:05 np0005534386 dracut[1241]: *** Including module: kernel-modules ***
Nov 25 05:37:05 np0005534386 kernel: block vda: the capability attribute has been deprecated.
Nov 25 05:37:05 np0005534386 dracut[1241]: *** Including module: kernel-modules-extra ***
Nov 25 05:37:05 np0005534386 dracut[1241]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 25 05:37:05 np0005534386 dracut[1241]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 25 05:37:05 np0005534386 dracut[1241]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 25 05:37:05 np0005534386 dracut[1241]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 25 05:37:05 np0005534386 dracut[1241]: *** Including module: qemu ***
Nov 25 05:37:05 np0005534386 dracut[1241]: *** Including module: fstab-sys ***
Nov 25 05:37:05 np0005534386 dracut[1241]: *** Including module: rootfs-block ***
Nov 25 05:37:05 np0005534386 dracut[1241]: *** Including module: terminfo ***
Nov 25 05:37:05 np0005534386 dracut[1241]: *** Including module: udev-rules ***
Nov 25 05:37:06 np0005534386 dracut[1241]: Skipping udev rule: 91-permissions.rules
Nov 25 05:37:06 np0005534386 dracut[1241]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 25 05:37:06 np0005534386 dracut[1241]: *** Including module: virtiofs ***
Nov 25 05:37:06 np0005534386 dracut[1241]: *** Including module: dracut-systemd ***
Nov 25 05:37:06 np0005534386 dracut[1241]: *** Including module: usrmount ***
Nov 25 05:37:06 np0005534386 dracut[1241]: *** Including module: base ***
Nov 25 05:37:06 np0005534386 dracut[1241]: *** Including module: fs-lib ***
Nov 25 05:37:06 np0005534386 dracut[1241]: *** Including module: kdumpbase ***
Nov 25 05:37:06 np0005534386 dracut[1241]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 25 05:37:06 np0005534386 dracut[1241]:   microcode_ctl module: mangling fw_dir
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: configuration "intel" is ignored
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 25 05:37:06 np0005534386 irqbalance[742]: Cannot change IRQ 45 affinity: Operation not permitted
Nov 25 05:37:06 np0005534386 irqbalance[742]: IRQ 45 affinity is now unmanaged
Nov 25 05:37:06 np0005534386 irqbalance[742]: Cannot change IRQ 48 affinity: Operation not permitted
Nov 25 05:37:06 np0005534386 irqbalance[742]: IRQ 48 affinity is now unmanaged
Nov 25 05:37:06 np0005534386 irqbalance[742]: Cannot change IRQ 46 affinity: Operation not permitted
Nov 25 05:37:06 np0005534386 irqbalance[742]: IRQ 46 affinity is now unmanaged
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 25 05:37:06 np0005534386 dracut[1241]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 25 05:37:06 np0005534386 dracut[1241]: *** Including module: openssl ***
Nov 25 05:37:06 np0005534386 dracut[1241]: *** Including module: shutdown ***
Nov 25 05:37:06 np0005534386 dracut[1241]: *** Including module: squash ***
Nov 25 05:37:06 np0005534386 dracut[1241]: *** Including modules done ***
Nov 25 05:37:06 np0005534386 dracut[1241]: *** Installing kernel module dependencies ***
Nov 25 05:37:07 np0005534386 dracut[1241]: *** Installing kernel module dependencies done ***
Nov 25 05:37:07 np0005534386 dracut[1241]: *** Resolving executable dependencies ***
Nov 25 05:37:08 np0005534386 dracut[1241]: *** Resolving executable dependencies done ***
Nov 25 05:37:08 np0005534386 dracut[1241]: *** Generating early-microcode cpio image ***
Nov 25 05:37:08 np0005534386 dracut[1241]: *** Store current command line parameters ***
Nov 25 05:37:08 np0005534386 dracut[1241]: Stored kernel commandline:
Nov 25 05:37:08 np0005534386 dracut[1241]: No dracut internal kernel commandline stored in the initramfs
Nov 25 05:37:08 np0005534386 dracut[1241]: *** Install squash loader ***
Nov 25 05:37:08 np0005534386 dracut[1241]: *** Squashing the files inside the initramfs ***
Nov 25 05:37:10 np0005534386 dracut[1241]: *** Squashing the files inside the initramfs done ***
Nov 25 05:37:10 np0005534386 dracut[1241]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 25 05:37:10 np0005534386 dracut[1241]: *** Hardlinking files ***
Nov 25 05:37:10 np0005534386 dracut[1241]: Mode:           real
Nov 25 05:37:10 np0005534386 dracut[1241]: Files:          50
Nov 25 05:37:10 np0005534386 dracut[1241]: Linked:         0 files
Nov 25 05:37:10 np0005534386 dracut[1241]: Compared:       0 xattrs
Nov 25 05:37:10 np0005534386 dracut[1241]: Compared:       0 files
Nov 25 05:37:10 np0005534386 dracut[1241]: Saved:          0 B
Nov 25 05:37:10 np0005534386 dracut[1241]: Duration:       0.000381 seconds
Nov 25 05:37:10 np0005534386 dracut[1241]: *** Hardlinking files done ***
Nov 25 05:37:10 np0005534386 dracut[1241]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 25 05:37:10 np0005534386 kdumpctl[979]: kdump: kexec: loaded kdump kernel
Nov 25 05:37:10 np0005534386 kdumpctl[979]: kdump: Starting kdump: [OK]
Nov 25 05:37:10 np0005534386 systemd[1]: Finished Crash recovery kernel arming.
Nov 25 05:37:10 np0005534386 systemd[1]: Startup finished in 1.200s (kernel) + 2.013s (initrd) + 15.951s (userspace) = 19.165s.
Nov 25 05:37:11 np0005534386 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 05:37:27 np0005534386 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 05:38:25 np0005534386 sshd-session[4368]: Accepted publickey for zuul from 192.168.26.12 port 58116 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 25 05:38:25 np0005534386 systemd[1]: Created slice User Slice of UID 1000.
Nov 25 05:38:25 np0005534386 systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 25 05:38:25 np0005534386 systemd-logind[744]: New session 1 of user zuul.
Nov 25 05:38:25 np0005534386 systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 25 05:38:25 np0005534386 systemd[1]: Starting User Manager for UID 1000...
Nov 25 05:38:25 np0005534386 systemd[4372]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 05:38:25 np0005534386 systemd[4372]: Queued start job for default target Main User Target.
Nov 25 05:38:25 np0005534386 systemd[4372]: Created slice User Application Slice.
Nov 25 05:38:25 np0005534386 systemd[4372]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 05:38:25 np0005534386 systemd[4372]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 05:38:25 np0005534386 systemd[4372]: Reached target Paths.
Nov 25 05:38:25 np0005534386 systemd[4372]: Reached target Timers.
Nov 25 05:38:25 np0005534386 systemd[4372]: Starting D-Bus User Message Bus Socket...
Nov 25 05:38:25 np0005534386 systemd[4372]: Starting Create User's Volatile Files and Directories...
Nov 25 05:38:25 np0005534386 systemd[4372]: Listening on D-Bus User Message Bus Socket.
Nov 25 05:38:25 np0005534386 systemd[4372]: Reached target Sockets.
Nov 25 05:38:25 np0005534386 systemd[4372]: Finished Create User's Volatile Files and Directories.
Nov 25 05:38:25 np0005534386 systemd[4372]: Reached target Basic System.
Nov 25 05:38:25 np0005534386 systemd[4372]: Reached target Main User Target.
Nov 25 05:38:25 np0005534386 systemd[4372]: Startup finished in 79ms.
Nov 25 05:38:25 np0005534386 systemd[1]: Started User Manager for UID 1000.
Nov 25 05:38:25 np0005534386 systemd[1]: Started Session 1 of User zuul.
Nov 25 05:38:25 np0005534386 sshd-session[4368]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 05:38:26 np0005534386 python3[4454]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 05:38:27 np0005534386 python3[4482]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 05:38:31 np0005534386 python3[4536]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 05:38:32 np0005534386 python3[4576]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 25 05:38:33 np0005534386 python3[4602]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDpvlFx9+V6V/WcsjHIxw0QC8khxJHm2u1+NhHe3MkSESePIKQkLJEO/qzFfQGQXsOTp42hy3pttHfQlTxariFRSDZYmASaY96d+iEq2eo4hIiXmeWJy4yeHS9ysbu19HMp7ByCfENFDPef/NpBZOMc2ZAZeg3THG/oghLW1I9Ht+uDs8JHin5gPiGZj3YYuwvHFhOjerzDRj1SeTwHwqjenro7y2tAdXxdqsjWxUL+0trdNBNj0A9VHmJgVqraL1pTPJ550CwgEcWD0URHjRgci8QruQrQj532iCAMJ+kgjmwnBODITGOr2Y1zW61CfWCsiZS7liSiWMFWqzMGgx3OOU12Xijy7naoJgaYejf7j2YXj6fFt0tf/9vrcQinT9k97kuF3HkllfPD4Vr9NVZypP7rdTdjN4gTlAt97d2MJBD1W9NxJULEE1yRdmmg+TcstoP4XVol0BwvyUROEIUdbJim3PdcUKx/bCnsahvx6HzvFzflcSafAbtxse2ya40= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:34 np0005534386 python3[4626]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:34 np0005534386 python3[4725]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:38:34 np0005534386 python3[4796]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764049114.2995086-207-200269649692906/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=38f3e1fde2e54927936c749602139767_id_rsa follow=False checksum=a3b063953b4d9fefc8182f1d48a97bf9d117495b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:35 np0005534386 python3[4919]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:38:35 np0005534386 python3[4990]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764049114.896086-240-236792691915822/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=38f3e1fde2e54927936c749602139767_id_rsa.pub follow=False checksum=3f1fe54b28df19b7796fb07f90fb36230dea26cd backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:36 np0005534386 python3[5038]: ansible-ping Invoked with data=pong
Nov 25 05:38:36 np0005534386 python3[5062]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 05:38:38 np0005534386 python3[5116]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 25 05:38:39 np0005534386 python3[5148]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:39 np0005534386 python3[5172]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:39 np0005534386 python3[5196]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:39 np0005534386 python3[5220]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:39 np0005534386 python3[5244]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:39 np0005534386 python3[5268]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:41 np0005534386 sudo[5292]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrziesebqhabbhnioqcrzsxuutdimcss ; /usr/bin/python3'
Nov 25 05:38:41 np0005534386 sudo[5292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:38:41 np0005534386 python3[5294]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:41 np0005534386 sudo[5292]: pam_unix(sudo:session): session closed for user root
Nov 25 05:38:41 np0005534386 sudo[5370]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msceflaircfvnmzubygvqwvbcletuplw ; /usr/bin/python3'
Nov 25 05:38:41 np0005534386 sudo[5370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:38:41 np0005534386 python3[5372]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:38:41 np0005534386 sudo[5370]: pam_unix(sudo:session): session closed for user root
Nov 25 05:38:41 np0005534386 sudo[5443]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osethsjwlvvrjqmypcksdxhssmhcglrq ; /usr/bin/python3'
Nov 25 05:38:41 np0005534386 sudo[5443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:38:41 np0005534386 python3[5445]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764049121.2404644-21-233984603274495/source follow=False _original_basename=mirror_info.sh.j2 checksum=3f92644b791816833989d215b9a84c589a7b8ebd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:41 np0005534386 sudo[5443]: pam_unix(sudo:session): session closed for user root
Nov 25 05:38:42 np0005534386 python3[5493]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:42 np0005534386 python3[5517]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:42 np0005534386 python3[5541]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:42 np0005534386 python3[5565]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:43 np0005534386 python3[5589]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:43 np0005534386 python3[5613]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:43 np0005534386 python3[5637]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:43 np0005534386 python3[5661]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:43 np0005534386 python3[5685]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:44 np0005534386 python3[5709]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:44 np0005534386 python3[5733]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:44 np0005534386 python3[5757]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:44 np0005534386 python3[5781]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:44 np0005534386 python3[5805]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:44 np0005534386 python3[5829]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:45 np0005534386 python3[5853]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:45 np0005534386 python3[5877]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:45 np0005534386 python3[5901]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:45 np0005534386 python3[5925]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:45 np0005534386 python3[5949]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:46 np0005534386 python3[5973]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:46 np0005534386 python3[5997]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:46 np0005534386 python3[6021]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:46 np0005534386 python3[6045]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:46 np0005534386 python3[6069]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:47 np0005534386 python3[6093]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:38:49 np0005534386 sudo[6117]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdfzubysgsjqydifgyqdgefilmyoufpz ; /usr/bin/python3'
Nov 25 05:38:49 np0005534386 sudo[6117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:38:49 np0005534386 python3[6119]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 25 05:38:49 np0005534386 systemd[1]: Starting Time & Date Service...
Nov 25 05:38:49 np0005534386 systemd[1]: Started Time & Date Service.
Nov 25 05:38:49 np0005534386 systemd-timedated[6121]: Changed time zone to 'UTC' (UTC).
Nov 25 05:38:49 np0005534386 sudo[6117]: pam_unix(sudo:session): session closed for user root
Nov 25 05:38:49 np0005534386 sudo[6148]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjzhlznxdmsbsomugdqygpvwqsgrysid ; /usr/bin/python3'
Nov 25 05:38:49 np0005534386 sudo[6148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:38:50 np0005534386 python3[6150]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:50 np0005534386 sudo[6148]: pam_unix(sudo:session): session closed for user root
Nov 25 05:38:50 np0005534386 python3[6226]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:38:50 np0005534386 python3[6297]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764049130.1950529-153-277847283013663/source _original_basename=tmpflphdceg follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:50 np0005534386 python3[6397]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:38:51 np0005534386 python3[6468]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764049130.7804863-183-101051896190477/source _original_basename=tmp1pkx3gw8 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:51 np0005534386 sudo[6568]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-augvdqjxdvhxpjykkfvdsivxwacxyqjg ; /usr/bin/python3'
Nov 25 05:38:51 np0005534386 sudo[6568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:38:51 np0005534386 python3[6570]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:38:51 np0005534386 sudo[6568]: pam_unix(sudo:session): session closed for user root
Nov 25 05:38:51 np0005534386 sudo[6641]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuiqabvosogitlmvcqujwpqfnmbilfng ; /usr/bin/python3'
Nov 25 05:38:51 np0005534386 sudo[6641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:38:51 np0005534386 python3[6643]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764049131.5394697-231-98201657740431/source _original_basename=tmpw1hsx_5c follow=False checksum=a7891c19fa72454f8162d76f201193d911cf12a5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:51 np0005534386 sudo[6641]: pam_unix(sudo:session): session closed for user root
Nov 25 05:38:52 np0005534386 python3[6691]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:38:52 np0005534386 python3[6717]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:38:52 np0005534386 sudo[6795]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnzkpzkomatbbpsxpvnhoigldskxxqjk ; /usr/bin/python3'
Nov 25 05:38:52 np0005534386 sudo[6795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:38:52 np0005534386 python3[6797]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:38:52 np0005534386 sudo[6795]: pam_unix(sudo:session): session closed for user root
Nov 25 05:38:52 np0005534386 sudo[6868]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtjqsdmtnfxqggstncwssrckkgdrogxr ; /usr/bin/python3'
Nov 25 05:38:52 np0005534386 sudo[6868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:38:53 np0005534386 python3[6870]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764049132.6998332-273-69687165948468/source _original_basename=tmpqusup5or follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:38:53 np0005534386 sudo[6868]: pam_unix(sudo:session): session closed for user root
Nov 25 05:38:53 np0005534386 sudo[6919]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhjyjghnmvauvtdmlrfhwjobpslhlzsn ; /usr/bin/python3'
Nov 25 05:38:53 np0005534386 sudo[6919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:38:53 np0005534386 python3[6921]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e08-49e2-dbfb-6923-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:38:53 np0005534386 sudo[6919]: pam_unix(sudo:session): session closed for user root
Nov 25 05:38:54 np0005534386 python3[6949]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                             _uses_shell=True zuul_log_id=fa163e08-49e2-dbfb-6923-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 25 05:38:55 np0005534386 python3[6977]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:39:10 np0005534386 sudo[7001]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzldsnaooftzxdhhfbqzijdmplordzjn ; /usr/bin/python3'
Nov 25 05:39:10 np0005534386 sudo[7001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:39:10 np0005534386 python3[7003]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:39:10 np0005534386 sudo[7001]: pam_unix(sudo:session): session closed for user root
Nov 25 05:39:19 np0005534386 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 25 05:39:33 np0005534386 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Nov 25 05:39:33 np0005534386 kernel: pci 0000:07:00.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 25 05:39:33 np0005534386 kernel: pci 0000:07:00.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 25 05:39:33 np0005534386 kernel: pci 0000:07:00.0: ROM [mem 0x00000000-0x0003ffff pref]
Nov 25 05:39:33 np0005534386 kernel: pci 0000:07:00.0: ROM [mem 0xfe000000-0xfe03ffff pref]: assigned
Nov 25 05:39:33 np0005534386 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfb600000-0xfb603fff 64bit pref]: assigned
Nov 25 05:39:33 np0005534386 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfe040000-0xfe040fff]: assigned
Nov 25 05:39:33 np0005534386 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002)
Nov 25 05:39:33 np0005534386 NetworkManager[811]: <info>  [1764049173.4648] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 25 05:39:33 np0005534386 systemd-udevd[7007]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 05:39:33 np0005534386 NetworkManager[811]: <info>  [1764049173.4773] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 05:39:33 np0005534386 NetworkManager[811]: <info>  [1764049173.4793] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 25 05:39:33 np0005534386 NetworkManager[811]: <info>  [1764049173.4797] device (eth1): carrier: link connected
Nov 25 05:39:33 np0005534386 NetworkManager[811]: <info>  [1764049173.4800] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 25 05:39:33 np0005534386 NetworkManager[811]: <info>  [1764049173.4805] policy: auto-activating connection 'Wired connection 1' (a2275460-ad3e-3060-b256-d912ae2c7b1b)
Nov 25 05:39:33 np0005534386 NetworkManager[811]: <info>  [1764049173.4808] device (eth1): Activation: starting connection 'Wired connection 1' (a2275460-ad3e-3060-b256-d912ae2c7b1b)
Nov 25 05:39:33 np0005534386 NetworkManager[811]: <info>  [1764049173.4809] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 05:39:33 np0005534386 NetworkManager[811]: <info>  [1764049173.4813] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 05:39:33 np0005534386 NetworkManager[811]: <info>  [1764049173.4817] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 05:39:33 np0005534386 NetworkManager[811]: <info>  [1764049173.4821] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 25 05:39:34 np0005534386 python3[7033]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e08-49e2-5d7d-55de-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:39:40 np0005534386 sudo[7111]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owtsujuqyzmeimzqcmrhudpnbvbytdex ; OS_CLOUD=ibm-bm4-nodepool /usr/bin/python3'
Nov 25 05:39:40 np0005534386 sudo[7111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:39:40 np0005534386 python3[7113]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:39:40 np0005534386 sudo[7111]: pam_unix(sudo:session): session closed for user root
Nov 25 05:39:40 np0005534386 sudo[7184]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxoutbdicfwjltzngxdflxkufdkiwqkw ; OS_CLOUD=ibm-bm4-nodepool /usr/bin/python3'
Nov 25 05:39:40 np0005534386 sudo[7184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:39:40 np0005534386 python3[7186]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764049180.5484755-111-148617451098579/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=7d09a049e2499075540ad2552c6275d6fcac7f94 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:39:40 np0005534386 sudo[7184]: pam_unix(sudo:session): session closed for user root
Nov 25 05:39:41 np0005534386 sudo[7234]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlxxtsqqlywwtfodexlslrxqylzhpvde ; OS_CLOUD=ibm-bm4-nodepool /usr/bin/python3'
Nov 25 05:39:41 np0005534386 sudo[7234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:39:41 np0005534386 python3[7236]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 05:39:41 np0005534386 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 25 05:39:41 np0005534386 systemd[1]: Stopped Network Manager Wait Online.
Nov 25 05:39:41 np0005534386 systemd[1]: Stopping Network Manager Wait Online...
Nov 25 05:39:41 np0005534386 NetworkManager[811]: <info>  [1764049181.4988] caught SIGTERM, shutting down normally.
Nov 25 05:39:41 np0005534386 systemd[1]: Stopping Network Manager...
Nov 25 05:39:41 np0005534386 NetworkManager[811]: <info>  [1764049181.4996] dhcp4 (eth0): canceled DHCP transaction
Nov 25 05:39:41 np0005534386 NetworkManager[811]: <info>  [1764049181.4996] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 05:39:41 np0005534386 NetworkManager[811]: <info>  [1764049181.4996] dhcp4 (eth0): state changed no lease
Nov 25 05:39:41 np0005534386 NetworkManager[811]: <info>  [1764049181.4998] dhcp6 (eth0): canceled DHCP transaction
Nov 25 05:39:41 np0005534386 NetworkManager[811]: <info>  [1764049181.4998] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 05:39:41 np0005534386 NetworkManager[811]: <info>  [1764049181.4998] dhcp6 (eth0): state changed no lease
Nov 25 05:39:41 np0005534386 NetworkManager[811]: <info>  [1764049181.5000] manager: NetworkManager state is now CONNECTING
Nov 25 05:39:41 np0005534386 NetworkManager[811]: <info>  [1764049181.5084] dhcp4 (eth1): canceled DHCP transaction
Nov 25 05:39:41 np0005534386 NetworkManager[811]: <info>  [1764049181.5084] dhcp4 (eth1): state changed no lease
Nov 25 05:39:41 np0005534386 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 05:39:41 np0005534386 NetworkManager[811]: <info>  [1764049181.5104] exiting (success)
Nov 25 05:39:41 np0005534386 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 05:39:41 np0005534386 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 25 05:39:41 np0005534386 systemd[1]: Stopped Network Manager.
Nov 25 05:39:41 np0005534386 systemd[1]: Starting Network Manager...
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.5476] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:724c5288-952e-4df2-81c9-c4395c2e16fd)
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.5477] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.5516] manager[0x56113d0ea090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 25 05:39:41 np0005534386 systemd[1]: Starting Hostname Service...
Nov 25 05:39:41 np0005534386 systemd[1]: Started Hostname Service.
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6030] hostname: hostname: using hostnamed
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6030] hostname: static hostname changed from (none) to "np0005534386"
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6032] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6035] manager[0x56113d0ea090]: rfkill: Wi-Fi hardware radio set enabled
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6035] manager[0x56113d0ea090]: rfkill: WWAN hardware radio set enabled
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6052] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6052] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6052] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6052] manager: Networking is enabled by state file
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6054] settings: Loaded settings plugin: keyfile (internal)
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6056] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6071] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6077] dhcp: init: Using DHCP client 'internal'
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6078] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6081] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6085] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6089] device (lo): Activation: starting connection 'lo' (38cce539-3d4c-4266-b4bb-4a3c7b88c026)
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6093] device (eth0): carrier: link connected
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6096] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6099] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6099] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6103] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6107] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6110] device (eth1): carrier: link connected
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6113] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6117] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (a2275460-ad3e-3060-b256-d912ae2c7b1b) (indicated)
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6117] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6120] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6124] device (eth1): Activation: starting connection 'Wired connection 1' (a2275460-ad3e-3060-b256-d912ae2c7b1b)
Nov 25 05:39:41 np0005534386 systemd[1]: Started Network Manager.
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6128] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6130] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6131] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6132] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6133] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6145] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6147] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6148] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6149] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6159] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6161] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6163] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6165] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6169] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6171] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 05:39:41 np0005534386 systemd[1]: Starting Network Manager Wait Online...
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6192] dhcp4 (eth0): state changed new lease, address=192.168.26.115
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6197] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6262] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6265] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 25 05:39:41 np0005534386 NetworkManager[7248]: <info>  [1764049181.6271] device (lo): Activation: successful, device activated.
Nov 25 05:39:41 np0005534386 sudo[7234]: pam_unix(sudo:session): session closed for user root
Nov 25 05:39:41 np0005534386 python3[7308]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e08-49e2-5d7d-55de-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:39:42 np0005534386 NetworkManager[7248]: <info>  [1764049182.7127] dhcp6 (eth0): state changed new lease, address=2001:db8::331
Nov 25 05:39:42 np0005534386 NetworkManager[7248]: <info>  [1764049182.7134] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 25 05:39:42 np0005534386 NetworkManager[7248]: <info>  [1764049182.7156] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 25 05:39:42 np0005534386 NetworkManager[7248]: <info>  [1764049182.7157] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 25 05:39:42 np0005534386 NetworkManager[7248]: <info>  [1764049182.7158] manager: NetworkManager state is now CONNECTED_SITE
Nov 25 05:39:42 np0005534386 NetworkManager[7248]: <info>  [1764049182.7160] device (eth0): Activation: successful, device activated.
Nov 25 05:39:42 np0005534386 NetworkManager[7248]: <info>  [1764049182.7163] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 25 05:39:52 np0005534386 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 05:40:11 np0005534386 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 05:40:18 np0005534386 sudo[7407]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtnegfxddnzyppbogrgvcjfmyurhymgo ; OS_CLOUD=ibm-bm4-nodepool /usr/bin/python3'
Nov 25 05:40:18 np0005534386 sudo[7407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:40:18 np0005534386 python3[7409]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:40:18 np0005534386 sudo[7407]: pam_unix(sudo:session): session closed for user root
Nov 25 05:40:18 np0005534386 sudo[7480]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wreuhhyluvdydwnrgihvqroutrdkwugv ; OS_CLOUD=ibm-bm4-nodepool /usr/bin/python3'
Nov 25 05:40:18 np0005534386 sudo[7480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:40:18 np0005534386 python3[7482]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764049218.0536034-265-192552325956247/source _original_basename=tmp_eo0bttj follow=False checksum=3426a992f8d2481a8d0e09553152c2c6edba8fd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:40:18 np0005534386 sudo[7480]: pam_unix(sudo:session): session closed for user root
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5585] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 25 05:40:26 np0005534386 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 05:40:26 np0005534386 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5792] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5794] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5797] device (eth1): Activation: successful, device activated.
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5800] manager: startup complete
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5801] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <warn>  [1764049226.5803] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5807] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 25 05:40:26 np0005534386 systemd[1]: Finished Network Manager Wait Online.
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5854] dhcp4 (eth1): canceled DHCP transaction
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5854] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5854] dhcp4 (eth1): state changed no lease
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5865] policy: auto-activating connection 'ci-private-network' (4c59c9f9-c07e-57f4-9758-5e85b4fcf2c4)
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5868] device (eth1): Activation: starting connection 'ci-private-network' (4c59c9f9-c07e-57f4-9758-5e85b4fcf2c4)
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5869] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5870] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5875] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5881] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5904] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5905] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 05:40:26 np0005534386 NetworkManager[7248]: <info>  [1764049226.5910] device (eth1): Activation: successful, device activated.
Nov 25 05:40:36 np0005534386 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 05:40:49 np0005534386 systemd[4372]: Starting Mark boot as successful...
Nov 25 05:40:49 np0005534386 systemd[4372]: Finished Mark boot as successful.
Nov 25 05:41:18 np0005534386 sshd-session[4381]: Received disconnect from 192.168.26.12 port 58116:11: disconnected by user
Nov 25 05:41:18 np0005534386 sshd-session[4381]: Disconnected from user zuul 192.168.26.12 port 58116
Nov 25 05:41:18 np0005534386 sshd-session[4368]: pam_unix(sshd:session): session closed for user zuul
Nov 25 05:41:18 np0005534386 systemd-logind[744]: Session 1 logged out. Waiting for processes to exit.
Nov 25 05:43:49 np0005534386 systemd[4372]: Created slice User Background Tasks Slice.
Nov 25 05:43:49 np0005534386 systemd[4372]: Starting Cleanup of User's Temporary Files and Directories...
Nov 25 05:43:49 np0005534386 systemd[4372]: Finished Cleanup of User's Temporary Files and Directories.
Nov 25 05:44:16 np0005534386 sshd-session[7534]: Accepted publickey for zuul from 192.168.26.12 port 59716 ssh2: RSA SHA256:4jB/2RHaNWlrspEB71DzZDzac0RbBWY2hKV/rLNCB+0
Nov 25 05:44:16 np0005534386 systemd-logind[744]: New session 3 of user zuul.
Nov 25 05:44:16 np0005534386 systemd[1]: Started Session 3 of User zuul.
Nov 25 05:44:16 np0005534386 sshd-session[7534]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 05:44:16 np0005534386 sudo[7561]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loryfjcmjhffocgqcophamyokdmopbtk ; /usr/bin/python3'
Nov 25 05:44:16 np0005534386 sudo[7561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:16 np0005534386 python3[7563]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                             _uses_shell=True zuul_log_id=fa163e08-49e2-d8eb-05fd-000000001cc2-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:44:16 np0005534386 sudo[7561]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:16 np0005534386 sudo[7590]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etkxgpysubfgnsmmrdwqdvldlvzlypzp ; /usr/bin/python3'
Nov 25 05:44:16 np0005534386 sudo[7590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:17 np0005534386 python3[7592]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:44:17 np0005534386 sudo[7590]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:17 np0005534386 sudo[7616]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkyzaqiskuwkuazagpddmocflllpoafe ; /usr/bin/python3'
Nov 25 05:44:17 np0005534386 sudo[7616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:17 np0005534386 python3[7618]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:44:17 np0005534386 sudo[7616]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:17 np0005534386 sudo[7642]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjsqblmidxeorqbyqbmtnzgywmoyuvci ; /usr/bin/python3'
Nov 25 05:44:17 np0005534386 sudo[7642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:17 np0005534386 python3[7644]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:44:17 np0005534386 sudo[7642]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:17 np0005534386 sudo[7668]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxqmeersbpkjdmqzvznbmshwdueinrrm ; /usr/bin/python3'
Nov 25 05:44:17 np0005534386 sudo[7668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:17 np0005534386 python3[7670]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:44:17 np0005534386 sudo[7668]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:17 np0005534386 sudo[7694]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tekvtwsiywhbiezailclpxjvjosyhakd ; /usr/bin/python3'
Nov 25 05:44:17 np0005534386 sudo[7694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:18 np0005534386 python3[7696]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:44:18 np0005534386 sudo[7694]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:18 np0005534386 sudo[7772]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbccsyqmdcpftgwkkabquabcvsebnanq ; /usr/bin/python3'
Nov 25 05:44:18 np0005534386 sudo[7772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:18 np0005534386 python3[7774]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:44:18 np0005534386 sudo[7772]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:18 np0005534386 sudo[7845]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ishmalnffmjjoidofcwnvmgdkcvmeiqk ; /usr/bin/python3'
Nov 25 05:44:18 np0005534386 sudo[7845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:18 np0005534386 python3[7847]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764049458.409201-464-146243923599433/source _original_basename=tmpuj3g89f1 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:44:18 np0005534386 sudo[7845]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:19 np0005534386 sudo[7895]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjqebugcfxyfthqkyhcownrxchslfrfc ; /usr/bin/python3'
Nov 25 05:44:19 np0005534386 sudo[7895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:19 np0005534386 python3[7897]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 05:44:19 np0005534386 systemd[1]: Reloading.
Nov 25 05:44:19 np0005534386 systemd-rc-local-generator[7915]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 05:44:19 np0005534386 sudo[7895]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:20 np0005534386 sudo[7951]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydqublfakvppiipqokmkjziweomobykg ; /usr/bin/python3'
Nov 25 05:44:20 np0005534386 sudo[7951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:20 np0005534386 python3[7953]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 25 05:44:20 np0005534386 sudo[7951]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:21 np0005534386 sudo[7977]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnfqxgdxjkcgooajqfesnazkmbozfmkx ; /usr/bin/python3'
Nov 25 05:44:21 np0005534386 sudo[7977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:21 np0005534386 python3[7979]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                             _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:44:21 np0005534386 sudo[7977]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:21 np0005534386 sudo[8005]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnblgmmqbiyflbdcfzgjpmigvurlfzjs ; /usr/bin/python3'
Nov 25 05:44:21 np0005534386 sudo[8005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:21 np0005534386 python3[8007]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                             _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:44:21 np0005534386 sudo[8005]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:21 np0005534386 sudo[8033]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znqrqshahlgixujrooxykkodideaivxy ; /usr/bin/python3'
Nov 25 05:44:21 np0005534386 sudo[8033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:21 np0005534386 python3[8035]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                             _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:44:21 np0005534386 sudo[8033]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:21 np0005534386 sudo[8061]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uouikergdiibmoncwmigxznenghgptun ; /usr/bin/python3'
Nov 25 05:44:21 np0005534386 sudo[8061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:21 np0005534386 python3[8063]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                             _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:44:21 np0005534386 sudo[8061]: pam_unix(sudo:session): session closed for user root
Nov 25 05:44:22 np0005534386 python3[8090]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                             _uses_shell=True zuul_log_id=fa163e08-49e2-d8eb-05fd-000000001cc9-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:44:22 np0005534386 python3[8120]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 25 05:44:24 np0005534386 sshd-session[7537]: Connection closed by 192.168.26.12 port 59716
Nov 25 05:44:24 np0005534386 sshd-session[7534]: pam_unix(sshd:session): session closed for user zuul
Nov 25 05:44:24 np0005534386 systemd[1]: session-3.scope: Deactivated successfully.
Nov 25 05:44:24 np0005534386 systemd[1]: session-3.scope: Consumed 2.926s CPU time.
Nov 25 05:44:24 np0005534386 systemd-logind[744]: Session 3 logged out. Waiting for processes to exit.
Nov 25 05:44:24 np0005534386 systemd-logind[744]: Removed session 3.
Nov 25 05:44:25 np0005534386 sshd-session[8125]: Accepted publickey for zuul from 192.168.26.12 port 42776 ssh2: RSA SHA256:4jB/2RHaNWlrspEB71DzZDzac0RbBWY2hKV/rLNCB+0
Nov 25 05:44:25 np0005534386 systemd-logind[744]: New session 4 of user zuul.
Nov 25 05:44:25 np0005534386 systemd[1]: Started Session 4 of User zuul.
Nov 25 05:44:25 np0005534386 sshd-session[8125]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 05:44:26 np0005534386 sudo[8152]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxsvjnmfhicsrgkssvquykpnquvdgjoe ; /usr/bin/python3'
Nov 25 05:44:26 np0005534386 sudo[8152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:44:26 np0005534386 python3[8154]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 25 05:44:36 np0005534386 chronyd[753]: Selected source 64.79.100.197 (2.centos.pool.ntp.org)
Nov 25 05:44:49 np0005534386 kernel: SELinux:  Converting 386 SID table entries...
Nov 25 05:44:49 np0005534386 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 05:44:49 np0005534386 kernel: SELinux:  policy capability open_perms=1
Nov 25 05:44:49 np0005534386 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 05:44:49 np0005534386 kernel: SELinux:  policy capability always_check_network=0
Nov 25 05:44:49 np0005534386 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 05:44:49 np0005534386 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 05:44:49 np0005534386 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 05:44:56 np0005534386 kernel: SELinux:  Converting 386 SID table entries...
Nov 25 05:44:56 np0005534386 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 05:44:56 np0005534386 kernel: SELinux:  policy capability open_perms=1
Nov 25 05:44:56 np0005534386 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 05:44:56 np0005534386 kernel: SELinux:  policy capability always_check_network=0
Nov 25 05:44:56 np0005534386 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 05:44:56 np0005534386 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 05:44:56 np0005534386 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 05:45:02 np0005534386 kernel: SELinux:  Converting 386 SID table entries...
Nov 25 05:45:02 np0005534386 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 05:45:02 np0005534386 kernel: SELinux:  policy capability open_perms=1
Nov 25 05:45:02 np0005534386 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 05:45:02 np0005534386 kernel: SELinux:  policy capability always_check_network=0
Nov 25 05:45:02 np0005534386 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 05:45:02 np0005534386 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 05:45:02 np0005534386 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 05:45:03 np0005534386 setsebool[8223]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 25 05:45:03 np0005534386 setsebool[8223]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 25 05:45:11 np0005534386 kernel: SELinux:  Converting 389 SID table entries...
Nov 25 05:45:11 np0005534386 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 05:45:11 np0005534386 kernel: SELinux:  policy capability open_perms=1
Nov 25 05:45:11 np0005534386 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 05:45:11 np0005534386 kernel: SELinux:  policy capability always_check_network=0
Nov 25 05:45:11 np0005534386 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 05:45:11 np0005534386 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 05:45:11 np0005534386 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 05:45:23 np0005534386 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 25 05:45:23 np0005534386 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 05:45:23 np0005534386 systemd[1]: Starting man-db-cache-update.service...
Nov 25 05:45:23 np0005534386 systemd[1]: Reloading.
Nov 25 05:45:24 np0005534386 systemd-rc-local-generator[8971]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 05:45:24 np0005534386 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 05:45:24 np0005534386 sudo[8152]: pam_unix(sudo:session): session closed for user root
Nov 25 05:45:25 np0005534386 python3[10754]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                              _uses_shell=True zuul_log_id=fa163e08-49e2-b268-3b05-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:45:25 np0005534386 kernel: evm: overlay not supported
Nov 25 05:45:25 np0005534386 systemd[4372]: Starting D-Bus User Message Bus...
Nov 25 05:45:25 np0005534386 dbus-broker-launch[12019]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 25 05:45:25 np0005534386 systemd[4372]: Started D-Bus User Message Bus.
Nov 25 05:45:25 np0005534386 dbus-broker-launch[12019]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 25 05:45:25 np0005534386 dbus-broker-lau[12019]: Ready
Nov 25 05:45:25 np0005534386 systemd[4372]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 25 05:45:25 np0005534386 systemd[4372]: Created slice Slice /user.
Nov 25 05:45:25 np0005534386 systemd[4372]: podman-11933.scope: unit configures an IP firewall, but not running as root.
Nov 25 05:45:25 np0005534386 systemd[4372]: (This warning is only shown for the first unit using IP firewalling.)
Nov 25 05:45:25 np0005534386 systemd[4372]: Started podman-11933.scope.
Nov 25 05:45:26 np0005534386 systemd[4372]: Started podman-pause-412aaaa8.scope.
Nov 25 05:45:27 np0005534386 sshd-session[8128]: Connection closed by 192.168.26.12 port 42776
Nov 25 05:45:27 np0005534386 sshd-session[8125]: pam_unix(sshd:session): session closed for user zuul
Nov 25 05:45:27 np0005534386 systemd[1]: session-4.scope: Deactivated successfully.
Nov 25 05:45:27 np0005534386 systemd[1]: session-4.scope: Consumed 43.094s CPU time.
Nov 25 05:45:27 np0005534386 systemd-logind[744]: Session 4 logged out. Waiting for processes to exit.
Nov 25 05:45:27 np0005534386 systemd-logind[744]: Removed session 4.
Nov 25 05:45:36 np0005534386 irqbalance[742]: Cannot change IRQ 47 affinity: Operation not permitted
Nov 25 05:45:36 np0005534386 irqbalance[742]: IRQ 47 affinity is now unmanaged
Nov 25 05:45:42 np0005534386 sshd-session[21878]: Connection closed by 192.168.26.121 port 57474 [preauth]
Nov 25 05:45:42 np0005534386 sshd-session[21880]: Connection closed by 192.168.26.121 port 57480 [preauth]
Nov 25 05:45:42 np0005534386 sshd-session[21881]: Unable to negotiate with 192.168.26.121 port 57490: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 25 05:45:42 np0005534386 sshd-session[21883]: Unable to negotiate with 192.168.26.121 port 57502: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 25 05:45:42 np0005534386 sshd-session[21882]: Unable to negotiate with 192.168.26.121 port 57498: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 25 05:45:50 np0005534386 sshd-session[26848]: Accepted publickey for zuul from 192.168.26.12 port 47056 ssh2: RSA SHA256:4jB/2RHaNWlrspEB71DzZDzac0RbBWY2hKV/rLNCB+0
Nov 25 05:45:50 np0005534386 systemd-logind[744]: New session 5 of user zuul.
Nov 25 05:45:50 np0005534386 systemd[1]: Started Session 5 of User zuul.
Nov 25 05:45:50 np0005534386 sshd-session[26848]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 05:45:51 np0005534386 python3[26992]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJMloj477qDMdGRXhtldOHKIANzR6JthtZk0+0zjdQfiwWqNeVPcB+7Hyv2C/Xd8ZvDSZKZ9Z8E1J5x7rEgofUQ= zuul@np0005534384
                                              manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:45:51 np0005534386 sudo[27153]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gncgbmnrqhpykggpmrfjlgcohjvmosvi ; /usr/bin/python3'
Nov 25 05:45:51 np0005534386 sudo[27153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:45:51 np0005534386 python3[27171]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJMloj477qDMdGRXhtldOHKIANzR6JthtZk0+0zjdQfiwWqNeVPcB+7Hyv2C/Xd8ZvDSZKZ9Z8E1J5x7rEgofUQ= zuul@np0005534384
                                              manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:45:51 np0005534386 sudo[27153]: pam_unix(sudo:session): session closed for user root
Nov 25 05:45:51 np0005534386 sudo[27545]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acqoxkkultrqcnhhmraxpmwstwsqdffx ; /usr/bin/python3'
Nov 25 05:45:51 np0005534386 sudo[27545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:45:52 np0005534386 python3[27554]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005534386 update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 25 05:45:52 np0005534386 useradd[27637]: new group: name=cloud-admin, GID=1002
Nov 25 05:45:52 np0005534386 useradd[27637]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Nov 25 05:45:52 np0005534386 sudo[27545]: pam_unix(sudo:session): session closed for user root
Nov 25 05:45:52 np0005534386 sudo[27743]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byntcbbqmgkvahqklxrarynosvhmkgli ; /usr/bin/python3'
Nov 25 05:45:52 np0005534386 sudo[27743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:45:52 np0005534386 python3[27753]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJMloj477qDMdGRXhtldOHKIANzR6JthtZk0+0zjdQfiwWqNeVPcB+7Hyv2C/Xd8ZvDSZKZ9Z8E1J5x7rEgofUQ= zuul@np0005534384
                                              manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 05:45:52 np0005534386 sudo[27743]: pam_unix(sudo:session): session closed for user root
Nov 25 05:45:52 np0005534386 sudo[28014]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxbdyvmxwehttvnbwrxbtuvtkbfdosyu ; /usr/bin/python3'
Nov 25 05:45:52 np0005534386 sudo[28014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:45:52 np0005534386 python3[28024]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:45:52 np0005534386 sudo[28014]: pam_unix(sudo:session): session closed for user root
Nov 25 05:45:52 np0005534386 sudo[28292]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvltnmrihhztzqdakybvrwnrxujmctwn ; /usr/bin/python3'
Nov 25 05:45:52 np0005534386 sudo[28292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:45:52 np0005534386 python3[28301]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764049552.4369562-120-54006767092210/source _original_basename=tmpdaq5vetl follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:45:53 np0005534386 sudo[28292]: pam_unix(sudo:session): session closed for user root
Nov 25 05:45:53 np0005534386 sudo[28647]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fucsbtyoumitgxwnssshkcxyumqwoyss ; /usr/bin/python3'
Nov 25 05:45:53 np0005534386 sudo[28647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:45:53 np0005534386 python3[28656]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Nov 25 05:45:53 np0005534386 systemd[1]: Starting Hostname Service...
Nov 25 05:45:53 np0005534386 systemd[1]: Started Hostname Service.
Nov 25 05:45:53 np0005534386 systemd-hostnamed[28782]: Changed pretty hostname to 'compute-0'
Nov 25 05:45:53 compute-0 systemd-hostnamed[28782]: Hostname set to <compute-0> (static)
Nov 25 05:45:53 compute-0 NetworkManager[7248]: <info>  [1764049553.7088] hostname: static hostname changed from "np0005534386" to "compute-0"
Nov 25 05:45:53 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 05:45:53 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 05:45:53 compute-0 sudo[28647]: pam_unix(sudo:session): session closed for user root
Nov 25 05:45:54 compute-0 sshd-session[26920]: Connection closed by 192.168.26.12 port 47056
Nov 25 05:45:54 compute-0 sshd-session[26848]: pam_unix(sshd:session): session closed for user zuul
Nov 25 05:45:54 compute-0 systemd[1]: session-5.scope: Deactivated successfully.
Nov 25 05:45:54 compute-0 systemd[1]: session-5.scope: Consumed 1.703s CPU time.
Nov 25 05:45:54 compute-0 systemd-logind[744]: Session 5 logged out. Waiting for processes to exit.
Nov 25 05:45:54 compute-0 systemd-logind[744]: Removed session 5.
Nov 25 05:45:55 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 05:45:55 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 05:45:55 compute-0 systemd[1]: man-db-cache-update.service: Consumed 39.365s CPU time.
Nov 25 05:45:55 compute-0 systemd[1]: run-r184e4c9a90df486a864524decb34a1fd.service: Deactivated successfully.
Nov 25 05:46:03 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 05:46:23 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 05:49:03 compute-0 sshd-session[29947]: Accepted publickey for zuul from 192.168.26.121 port 59618 ssh2: RSA SHA256:4jB/2RHaNWlrspEB71DzZDzac0RbBWY2hKV/rLNCB+0
Nov 25 05:49:03 compute-0 systemd-logind[744]: New session 6 of user zuul.
Nov 25 05:49:03 compute-0 systemd[1]: Started Session 6 of User zuul.
Nov 25 05:49:03 compute-0 sshd-session[29947]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 05:49:03 compute-0 python3[30023]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 05:49:04 compute-0 sudo[30133]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqilsfmjfpxypalruyhvrzijqokzggkq ; /usr/bin/python3'
Nov 25 05:49:04 compute-0 sudo[30133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:04 compute-0 python3[30135]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:49:04 compute-0 sudo[30133]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:04 compute-0 sudo[30206]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbdddofoipzlhzpewrjigovxpxnsmcqc ; /usr/bin/python3'
Nov 25 05:49:04 compute-0 sudo[30206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:05 compute-0 python3[30208]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764049744.5717337-34081-92951079198689/source mode=0755 _original_basename=delorean.repo follow=False checksum=e3b95b7f09d348e5957d77e420d5c5e8ddfded18 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:49:05 compute-0 sudo[30206]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:05 compute-0 sudo[30232]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzqsugefbynbizcedxdreiwmyrpvfgzi ; /usr/bin/python3'
Nov 25 05:49:05 compute-0 sudo[30232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:05 compute-0 python3[30234]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:49:05 compute-0 sudo[30232]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:05 compute-0 sudo[30305]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gklcvkqrpcdpbvhgaueohdgmmqjkwmkn ; /usr/bin/python3'
Nov 25 05:49:05 compute-0 sudo[30305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:05 compute-0 python3[30307]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764049744.5717337-34081-92951079198689/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=db3e4c64bdaa76558c4508aa678d2859474cade7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:49:05 compute-0 sudo[30305]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:05 compute-0 sudo[30331]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duoltuqmnsywoqpcrykoocntaywqnppd ; /usr/bin/python3'
Nov 25 05:49:05 compute-0 sudo[30331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:05 compute-0 python3[30333]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:49:05 compute-0 sudo[30331]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:05 compute-0 sudo[30404]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddvseynpepahqhfefuuvieohtoyfhwwj ; /usr/bin/python3'
Nov 25 05:49:05 compute-0 sudo[30404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:05 compute-0 python3[30406]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764049744.5717337-34081-92951079198689/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=8163d09913b97597f86e38eb45c3003e91da783e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:49:05 compute-0 sudo[30404]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:05 compute-0 sudo[30430]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mltjquuyzorkmuegtaccytkqzzfgzpej ; /usr/bin/python3'
Nov 25 05:49:05 compute-0 sudo[30430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:06 compute-0 python3[30432]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:49:06 compute-0 sudo[30430]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:06 compute-0 sudo[30503]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gufmienkwxlpsdyngmuuvyeztvelxkcj ; /usr/bin/python3'
Nov 25 05:49:06 compute-0 sudo[30503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:06 compute-0 python3[30505]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764049744.5717337-34081-92951079198689/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=d108d0750ad5b288ccc41bc6534ea307cc51e987 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:49:06 compute-0 sudo[30503]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:06 compute-0 sudo[30529]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbjbuvpuasqvaylxgsrlwtqrxnjzuani ; /usr/bin/python3'
Nov 25 05:49:06 compute-0 sudo[30529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:06 compute-0 python3[30531]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:49:06 compute-0 sudo[30529]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:06 compute-0 sudo[30602]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypqhfiwsiaigudifvxkwzdubsqssxqls ; /usr/bin/python3'
Nov 25 05:49:06 compute-0 sudo[30602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:06 compute-0 python3[30604]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764049744.5717337-34081-92951079198689/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=20c3917c672c059a872cf09a437f61890d2f89fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:49:06 compute-0 sudo[30602]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:06 compute-0 sudo[30628]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udqhttxkipxdjeqgpwhssjhfrrxwdusd ; /usr/bin/python3'
Nov 25 05:49:06 compute-0 sudo[30628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:06 compute-0 python3[30630]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:49:06 compute-0 sudo[30628]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:07 compute-0 sudo[30701]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqztmqvndslstelvsnuyciagvfzvkcjm ; /usr/bin/python3'
Nov 25 05:49:07 compute-0 sudo[30701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:07 compute-0 python3[30703]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764049744.5717337-34081-92951079198689/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=4d14f168e8a0e6930d905faffbcdf4fedd6664d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:49:07 compute-0 sudo[30701]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:07 compute-0 sudo[30727]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwpnbhxiahnhdaxcmzbrjvqtpliawomk ; /usr/bin/python3'
Nov 25 05:49:07 compute-0 sudo[30727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:07 compute-0 python3[30729]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 05:49:07 compute-0 sudo[30727]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:07 compute-0 sudo[30800]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgsumriswleondmqbzmqthnxvydbpntb ; /usr/bin/python3'
Nov 25 05:49:07 compute-0 sudo[30800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:49:07 compute-0 python3[30802]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764049744.5717337-34081-92951079198689/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=0be7eb3bc4775787fd2a5a7ac7bcd314c8e050fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 05:49:07 compute-0 sudo[30800]: pam_unix(sudo:session): session closed for user root
Nov 25 05:49:09 compute-0 sshd-session[30827]: Connection closed by 192.168.122.11 port 33342 [preauth]
Nov 25 05:49:09 compute-0 sshd-session[30828]: Connection closed by 192.168.122.11 port 33348 [preauth]
Nov 25 05:49:09 compute-0 sshd-session[30829]: Unable to negotiate with 192.168.122.11 port 33362: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 25 05:49:09 compute-0 sshd-session[30830]: Unable to negotiate with 192.168.122.11 port 33366: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 25 05:49:09 compute-0 sshd-session[30831]: Unable to negotiate with 192.168.122.11 port 33378: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 25 05:49:13 compute-0 python3[30860]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:52:39 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 25 05:52:39 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 25 05:52:39 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 25 05:52:39 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 25 05:54:13 compute-0 sshd-session[29950]: Received disconnect from 192.168.26.121 port 59618:11: disconnected by user
Nov 25 05:54:13 compute-0 sshd-session[29950]: Disconnected from user zuul 192.168.26.121 port 59618
Nov 25 05:54:13 compute-0 sshd-session[29947]: pam_unix(sshd:session): session closed for user zuul
Nov 25 05:54:13 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Nov 25 05:54:13 compute-0 systemd[1]: session-6.scope: Consumed 3.374s CPU time.
Nov 25 05:54:13 compute-0 systemd-logind[744]: Session 6 logged out. Waiting for processes to exit.
Nov 25 05:54:13 compute-0 systemd-logind[744]: Removed session 6.
Nov 25 05:59:44 compute-0 sshd-session[30865]: Accepted publickey for zuul from 192.168.122.30 port 38878 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 05:59:44 compute-0 systemd-logind[744]: New session 7 of user zuul.
Nov 25 05:59:44 compute-0 systemd[1]: Started Session 7 of User zuul.
Nov 25 05:59:44 compute-0 sshd-session[30865]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 05:59:44 compute-0 python3.9[31018]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 05:59:45 compute-0 sudo[31197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxiwhfoyzkvskzuoooseriushbfmbepu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050385.4138658-32-203628295129390/AnsiballZ_command.py'
Nov 25 05:59:45 compute-0 sudo[31197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 05:59:45 compute-0 python3.9[31199]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 05:59:53 compute-0 sudo[31197]: pam_unix(sudo:session): session closed for user root
Nov 25 05:59:54 compute-0 sshd-session[30868]: Connection closed by 192.168.122.30 port 38878
Nov 25 05:59:54 compute-0 sshd-session[30865]: pam_unix(sshd:session): session closed for user zuul
Nov 25 05:59:54 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Nov 25 05:59:54 compute-0 systemd[1]: session-7.scope: Consumed 5.965s CPU time.
Nov 25 05:59:54 compute-0 systemd-logind[744]: Session 7 logged out. Waiting for processes to exit.
Nov 25 05:59:54 compute-0 systemd-logind[744]: Removed session 7.
Nov 25 05:59:58 compute-0 sshd-session[31256]: Accepted publickey for zuul from 192.168.122.30 port 55856 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 05:59:58 compute-0 systemd-logind[744]: New session 8 of user zuul.
Nov 25 05:59:58 compute-0 systemd[1]: Started Session 8 of User zuul.
Nov 25 05:59:58 compute-0 sshd-session[31256]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 05:59:59 compute-0 python3.9[31409]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:00:00 compute-0 sshd-session[31259]: Connection closed by 192.168.122.30 port 55856
Nov 25 06:00:00 compute-0 sshd-session[31256]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:00:00 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Nov 25 06:00:00 compute-0 systemd-logind[744]: Session 8 logged out. Waiting for processes to exit.
Nov 25 06:00:00 compute-0 systemd-logind[744]: Removed session 8.
Nov 25 06:00:15 compute-0 sshd-session[31437]: Accepted publickey for zuul from 192.168.122.30 port 46902 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:00:15 compute-0 systemd-logind[744]: New session 9 of user zuul.
Nov 25 06:00:15 compute-0 systemd[1]: Started Session 9 of User zuul.
Nov 25 06:00:15 compute-0 sshd-session[31437]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:00:16 compute-0 python3.9[31590]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 25 06:00:16 compute-0 python3.9[31764]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:00:17 compute-0 sudo[31914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkcwuioiawyiywowxcshezpfuedbpoog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050417.1594453-45-266934672801751/AnsiballZ_command.py'
Nov 25 06:00:17 compute-0 sudo[31914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:00:17 compute-0 python3.9[31916]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:00:17 compute-0 sudo[31914]: pam_unix(sudo:session): session closed for user root
Nov 25 06:00:18 compute-0 sudo[32067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btqhkxtinrknaexonzkrhkgsykvijazn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050417.8335161-57-203397433172545/AnsiballZ_stat.py'
Nov 25 06:00:18 compute-0 sudo[32067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:00:18 compute-0 python3.9[32069]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:00:18 compute-0 sudo[32067]: pam_unix(sudo:session): session closed for user root
Nov 25 06:00:18 compute-0 sudo[32219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aayidmkxbbfwljdfmadxzqmutihwxomh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050418.3981535-65-165604416244099/AnsiballZ_file.py'
Nov 25 06:00:18 compute-0 sudo[32219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:00:18 compute-0 python3.9[32221]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:00:18 compute-0 sudo[32219]: pam_unix(sudo:session): session closed for user root
Nov 25 06:00:19 compute-0 sudo[32371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghtawozmxvrrzlaeqxnikrjuymjmawoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050418.9681227-73-20953698333436/AnsiballZ_stat.py'
Nov 25 06:00:19 compute-0 sudo[32371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:00:19 compute-0 python3.9[32373]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:00:19 compute-0 sudo[32371]: pam_unix(sudo:session): session closed for user root
Nov 25 06:00:19 compute-0 sudo[32494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uawklualzzgeejbwdnigyvkhibwcqthd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050418.9681227-73-20953698333436/AnsiballZ_copy.py'
Nov 25 06:00:19 compute-0 sudo[32494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:00:19 compute-0 python3.9[32496]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764050418.9681227-73-20953698333436/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:00:19 compute-0 sudo[32494]: pam_unix(sudo:session): session closed for user root
Nov 25 06:00:20 compute-0 sudo[32646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gridgscllefiobffsmqnuhsrrbqogzxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050419.9050725-88-225518068733797/AnsiballZ_setup.py'
Nov 25 06:00:20 compute-0 sudo[32646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:00:20 compute-0 python3.9[32648]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:00:20 compute-0 sudo[32646]: pam_unix(sudo:session): session closed for user root
Nov 25 06:00:20 compute-0 sudo[32802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akpernfrkvdsklabxucvhffdmczyppyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050420.5689425-96-241350208075251/AnsiballZ_file.py'
Nov 25 06:00:20 compute-0 sudo[32802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:00:20 compute-0 python3.9[32804]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:00:20 compute-0 sudo[32802]: pam_unix(sudo:session): session closed for user root
Nov 25 06:00:21 compute-0 sudo[32954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxfuqwindmxdofaidqdywxpndvdihoya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050421.212311-105-160227940818037/AnsiballZ_file.py'
Nov 25 06:00:21 compute-0 sudo[32954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:00:21 compute-0 python3.9[32956]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:00:21 compute-0 sudo[32954]: pam_unix(sudo:session): session closed for user root
Nov 25 06:00:22 compute-0 python3.9[33106]: ansible-ansible.builtin.service_facts Invoked
Nov 25 06:00:25 compute-0 python3.9[33359]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:00:26 compute-0 python3.9[33509]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:00:26 compute-0 python3.9[33663]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:00:27 compute-0 sudo[33819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qurziygyqbalnbmvafgoxvvpewutkxws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050427.22427-153-219561286340609/AnsiballZ_setup.py'
Nov 25 06:00:27 compute-0 sudo[33819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:00:27 compute-0 python3.9[33821]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:00:27 compute-0 sudo[33819]: pam_unix(sudo:session): session closed for user root
Nov 25 06:00:28 compute-0 sudo[33903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dckgfzyvaiskvnvlqhfxbxdvmpienuen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050427.22427-153-219561286340609/AnsiballZ_dnf.py'
Nov 25 06:00:28 compute-0 sudo[33903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:00:28 compute-0 python3.9[33905]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:01:01 compute-0 CROND[33981]: (root) CMD (run-parts /etc/cron.hourly)
Nov 25 06:01:01 compute-0 run-parts[33984]: (/etc/cron.hourly) starting 0anacron
Nov 25 06:01:01 compute-0 anacron[33992]: Anacron started on 2025-11-25
Nov 25 06:01:01 compute-0 anacron[33992]: Will run job `cron.daily' in 50 min.
Nov 25 06:01:01 compute-0 anacron[33992]: Will run job `cron.weekly' in 70 min.
Nov 25 06:01:01 compute-0 anacron[33992]: Will run job `cron.monthly' in 90 min.
Nov 25 06:01:01 compute-0 anacron[33992]: Jobs will be executed sequentially
Nov 25 06:01:01 compute-0 run-parts[33994]: (/etc/cron.hourly) finished 0anacron
Nov 25 06:01:01 compute-0 CROND[33980]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 25 06:01:57 compute-0 systemd[1]: Reloading.
Nov 25 06:01:57 compute-0 systemd-rc-local-generator[34120]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:01:57 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 25 06:01:57 compute-0 systemd[1]: Reloading.
Nov 25 06:01:57 compute-0 systemd-rc-local-generator[34164]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:01:58 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 25 06:01:58 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 25 06:01:58 compute-0 systemd[1]: Reloading.
Nov 25 06:01:58 compute-0 systemd-rc-local-generator[34199]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:01:58 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 25 06:01:58 compute-0 dbus-broker-launch[713]: Noticed file-system modification, trigger reload.
Nov 25 06:01:58 compute-0 dbus-broker-launch[713]: Noticed file-system modification, trigger reload.
Nov 25 06:01:58 compute-0 dbus-broker-launch[713]: Noticed file-system modification, trigger reload.
Nov 25 06:02:43 compute-0 kernel: SELinux:  Converting 2716 SID table entries...
Nov 25 06:02:43 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 06:02:43 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 25 06:02:43 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 06:02:43 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 25 06:02:43 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 06:02:43 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 06:02:43 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 06:02:44 compute-0 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 25 06:02:44 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 06:02:44 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 06:02:44 compute-0 systemd[1]: Reloading.
Nov 25 06:02:44 compute-0 systemd-rc-local-generator[34499]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:02:44 compute-0 systemd[1]: Starting dnf makecache...
Nov 25 06:02:44 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 06:02:44 compute-0 dnf[34540]: Failed determining last makecache time.
Nov 25 06:02:44 compute-0 sudo[33903]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:44 compute-0 dnf[34540]: delorean-openstack-barbican-42b4c41831408a8e323  21 kB/s | 3.0 kB     00:00
Nov 25 06:02:44 compute-0 dnf[34540]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7  20 kB/s | 3.0 kB     00:00
Nov 25 06:02:44 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 06:02:44 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 06:02:45 compute-0 systemd[1]: run-rbf52e511138841a49ac34081f6a2ec7f.service: Deactivated successfully.
Nov 25 06:02:45 compute-0 sudo[35418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzovphqbgktdfqhzjujvvugdimomhdgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050564.8610384-165-2139740381885/AnsiballZ_command.py'
Nov 25 06:02:45 compute-0 dnf[34540]: delorean-openstack-cinder-1c00d6490d88e436f26ef  22 kB/s | 3.0 kB     00:00
Nov 25 06:02:45 compute-0 sudo[35418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:45 compute-0 dnf[34540]: delorean-python-stevedore-c4acc5639fd2329372142  23 kB/s | 3.0 kB     00:00
Nov 25 06:02:45 compute-0 python3.9[35420]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:02:45 compute-0 dnf[34540]: delorean-python-observabilityclient-2f31846d73c  21 kB/s | 3.0 kB     00:00
Nov 25 06:02:45 compute-0 dnf[34540]: delorean-os-net-config-bbae2ed8a159b0435a473f38  21 kB/s | 3.0 kB     00:00
Nov 25 06:02:45 compute-0 dnf[34540]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6  22 kB/s | 3.0 kB     00:00
Nov 25 06:02:45 compute-0 dnf[34540]: delorean-python-designate-tests-tempest-347fdbc  23 kB/s | 3.0 kB     00:00
Nov 25 06:02:45 compute-0 sudo[35418]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:45 compute-0 dnf[34540]: delorean-openstack-glance-1fd12c29b339f30fe823e  21 kB/s | 3.0 kB     00:00
Nov 25 06:02:46 compute-0 dnf[34540]: delorean-openstack-keystone-e4b40af0ae3698fbbbb  22 kB/s | 3.0 kB     00:00
Nov 25 06:02:46 compute-0 dnf[34540]: delorean-openstack-manila-3c01b7181572c95dac462  23 kB/s | 3.0 kB     00:00
Nov 25 06:02:46 compute-0 dnf[34540]: delorean-python-whitebox-neutron-tests-tempest-  20 kB/s | 3.0 kB     00:00
Nov 25 06:02:46 compute-0 sudo[35709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtzvwtxkpmnenfhmzqcqgaqkuvmldedz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050565.9541175-173-22233976934366/AnsiballZ_selinux.py'
Nov 25 06:02:46 compute-0 sudo[35709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:46 compute-0 dnf[34540]: delorean-openstack-octavia-ba397f07a7331190208c  23 kB/s | 3.0 kB     00:00
Nov 25 06:02:46 compute-0 dnf[34540]: delorean-openstack-watcher-c014f81a8647287f6dcc  22 kB/s | 3.0 kB     00:00
Nov 25 06:02:46 compute-0 python3.9[35711]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 25 06:02:46 compute-0 sudo[35709]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:46 compute-0 dnf[34540]: delorean-python-tcib-1124124ec06aadbac34f0d340b  21 kB/s | 3.0 kB     00:00
Nov 25 06:02:46 compute-0 dnf[34540]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158  21 kB/s | 3.0 kB     00:00
Nov 25 06:02:47 compute-0 dnf[34540]: delorean-openstack-swift-dc98a8463506ac520c469a  21 kB/s | 3.0 kB     00:00
Nov 25 06:02:47 compute-0 sudo[35866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfrfjppatkuawkznuunrmzazgkgqzugd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050566.9053018-184-71125071911893/AnsiballZ_command.py'
Nov 25 06:02:47 compute-0 sudo[35866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:47 compute-0 dnf[34540]: delorean-python-tempestconf-8515371b7cceebd4282  21 kB/s | 3.0 kB     00:00
Nov 25 06:02:47 compute-0 python3.9[35868]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 25 06:02:47 compute-0 dnf[34540]: delorean-openstack-heat-ui-013accbfd179753bc3f0  21 kB/s | 3.0 kB     00:00
Nov 25 06:02:47 compute-0 dnf[34540]: CentOS Stream 9 - BaseOS                         13 kB/s | 6.7 kB     00:00
Nov 25 06:02:47 compute-0 sudo[35866]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:48 compute-0 sudo[36022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiprvwrbvacqrgpunmdcmsmlolyayfbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050567.9691398-192-1536177835394/AnsiballZ_file.py'
Nov 25 06:02:48 compute-0 sudo[36022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:49 compute-0 python3.9[36024]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:02:49 compute-0 sudo[36022]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:49 compute-0 sudo[36174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xueqqezpsrfcpnjzcrxikluiycmbuead ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050569.2147899-200-195548063216173/AnsiballZ_mount.py'
Nov 25 06:02:49 compute-0 sudo[36174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:49 compute-0 python3.9[36176]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 25 06:02:49 compute-0 sudo[36174]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:49 compute-0 dnf[34540]: CentOS Stream 9 - AppStream                     3.7 kB/s | 7.1 kB     00:01
Nov 25 06:02:50 compute-0 sudo[36327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzyyrhhmzvwkgnozqsmeyhvllyfnwhop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050570.3045983-228-25805176317544/AnsiballZ_file.py'
Nov 25 06:02:50 compute-0 sudo[36327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:50 compute-0 python3.9[36329]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:02:50 compute-0 sudo[36327]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:50 compute-0 sudo[36479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqpulbagptsyjiqbstbjteqerwumtire ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050570.7821932-236-237992876371575/AnsiballZ_stat.py'
Nov 25 06:02:50 compute-0 sudo[36479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:51 compute-0 python3.9[36481]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:02:51 compute-0 sudo[36479]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:51 compute-0 sudo[36602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sorzxlcefeqggxksthqlovuprqmfdihl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050570.7821932-236-237992876371575/AnsiballZ_copy.py'
Nov 25 06:02:51 compute-0 sudo[36602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:51 compute-0 python3.9[36604]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050570.7821932-236-237992876371575/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f66d7420451d7e559fc073a552573683f82f7762 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:02:51 compute-0 sudo[36602]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:51 compute-0 dnf[34540]: CentOS Stream 9 - CRB                           3.4 kB/s | 6.6 kB     00:01
Nov 25 06:02:52 compute-0 sudo[36755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntfdbjemqkqwtthexinenghoxzwptmwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050571.8747566-260-148213611409193/AnsiballZ_stat.py'
Nov 25 06:02:52 compute-0 sudo[36755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:52 compute-0 python3.9[36757]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:02:52 compute-0 sudo[36755]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:52 compute-0 dnf[34540]: CentOS Stream 9 - Extras packages                17 kB/s | 8.3 kB     00:00
Nov 25 06:02:52 compute-0 dnf[34540]: dlrn-antelope-testing                            21 kB/s | 3.0 kB     00:00
Nov 25 06:02:52 compute-0 sudo[36909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwaxyvvmfrgxvdbcewpziknblaqvozmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050572.3291147-268-35990219929538/AnsiballZ_command.py'
Nov 25 06:02:52 compute-0 sudo[36909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:52 compute-0 dnf[34540]: dlrn-antelope-build-deps                         21 kB/s | 3.0 kB     00:00
Nov 25 06:02:52 compute-0 python3.9[36911]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:02:52 compute-0 sudo[36909]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:52 compute-0 sudo[37063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkogrfqhctswikbalomqrhsfpkvumkdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050572.8203099-276-140564299208276/AnsiballZ_file.py'
Nov 25 06:02:52 compute-0 sudo[37063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:53 compute-0 python3.9[37065]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:02:53 compute-0 sudo[37063]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:53 compute-0 sudo[37215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmciuwtrqoggehycmapivpvywhnqthex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050573.448456-287-173829867260901/AnsiballZ_getent.py'
Nov 25 06:02:53 compute-0 sudo[37215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:54 compute-0 dnf[34540]: centos9-rabbitmq                                1.5 kB/s | 3.0 kB     00:02
Nov 25 06:02:55 compute-0 dnf[34540]: centos9-storage                                 5.9 kB/s | 3.0 kB     00:00
Nov 25 06:02:55 compute-0 dnf[34540]: centos9-opstools                                7.0 kB/s | 3.0 kB     00:00
Nov 25 06:02:55 compute-0 python3.9[37217]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 25 06:02:55 compute-0 sudo[37215]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:56 compute-0 dnf[34540]: NFV SIG OpenvSwitch                             6.5 kB/s | 3.0 kB     00:00
Nov 25 06:02:56 compute-0 sudo[37377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azkduyrkfiadcdjhygdpeojfoljczlwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050576.0509694-295-68350473982881/AnsiballZ_group.py'
Nov 25 06:02:56 compute-0 sudo[37377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:56 compute-0 python3.9[37379]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 06:02:56 compute-0 dnf[34540]: repo-setup-centos-appstream                      10 kB/s | 4.4 kB     00:00
Nov 25 06:02:56 compute-0 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 06:02:56 compute-0 groupadd[37380]: group added to /etc/group: name=qemu, GID=107
Nov 25 06:02:56 compute-0 groupadd[37380]: group added to /etc/gshadow: name=qemu
Nov 25 06:02:56 compute-0 groupadd[37380]: new group: name=qemu, GID=107
Nov 25 06:02:56 compute-0 sudo[37377]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:57 compute-0 sudo[37538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dokewdxzmlvrmuuxkcspzeopifjinqoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050576.6639204-303-193022442116310/AnsiballZ_user.py'
Nov 25 06:02:57 compute-0 sudo[37538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:57 compute-0 dnf[34540]: repo-setup-centos-baseos                        9.0 kB/s | 3.9 kB     00:00
Nov 25 06:02:57 compute-0 python3.9[37540]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 06:02:57 compute-0 useradd[37543]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Nov 25 06:02:57 compute-0 sudo[37538]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:57 compute-0 dnf[34540]: repo-setup-centos-highavailability              9.1 kB/s | 3.9 kB     00:00
Nov 25 06:02:57 compute-0 sudo[37700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxphfmkjrryzwuojkmzhnwbsvveddjzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050577.3553653-311-151671364486172/AnsiballZ_getent.py'
Nov 25 06:02:57 compute-0 sudo[37700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:57 compute-0 python3.9[37702]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 25 06:02:57 compute-0 sudo[37700]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:57 compute-0 dnf[34540]: repo-setup-centos-powertools                     10 kB/s | 4.3 kB     00:00
Nov 25 06:02:57 compute-0 sudo[37855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlnhrganfnibmwszyejpcfihnuekwdmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050577.8132877-319-96806210782278/AnsiballZ_group.py'
Nov 25 06:02:57 compute-0 sudo[37855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:58 compute-0 python3.9[37857]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 06:02:58 compute-0 groupadd[37859]: group added to /etc/group: name=hugetlbfs, GID=42477
Nov 25 06:02:58 compute-0 groupadd[37859]: group added to /etc/gshadow: name=hugetlbfs
Nov 25 06:02:58 compute-0 groupadd[37859]: new group: name=hugetlbfs, GID=42477
Nov 25 06:02:58 compute-0 sudo[37855]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:58 compute-0 dnf[34540]: Extra Packages for Enterprise Linux 9 - x86_64   84 kB/s |  35 kB     00:00
Nov 25 06:02:58 compute-0 sudo[38014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtiwczaaxqhrenwqbrazuzsnqqjxqpmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050578.3247802-328-226394395194598/AnsiballZ_file.py'
Nov 25 06:02:58 compute-0 sudo[38014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:58 compute-0 python3.9[38016]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 25 06:02:58 compute-0 sudo[38014]: pam_unix(sudo:session): session closed for user root
Nov 25 06:02:58 compute-0 dnf[34540]: Metadata cache created.
Nov 25 06:02:58 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 25 06:02:58 compute-0 systemd[1]: Finished dnf makecache.
Nov 25 06:02:58 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.361s CPU time.
Nov 25 06:02:59 compute-0 sudo[38167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trjozzppwjzddlhrmsofohsllipdwlzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050578.910827-339-105107705847447/AnsiballZ_dnf.py'
Nov 25 06:02:59 compute-0 sudo[38167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:02:59 compute-0 python3.9[38169]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:03:00 compute-0 sudo[38167]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:00 compute-0 sudo[38320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yekkyidnaqingprkqjcxegtxjkevnrta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050580.6483335-347-58549388463610/AnsiballZ_file.py'
Nov 25 06:03:00 compute-0 sudo[38320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:00 compute-0 python3.9[38322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:03:00 compute-0 sudo[38320]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:01 compute-0 sudo[38472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpwwuvxtnufbgmpxpzlnsipwtbzaygam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050581.1075883-355-13592546095707/AnsiballZ_stat.py'
Nov 25 06:03:01 compute-0 sudo[38472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:01 compute-0 python3.9[38474]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:03:01 compute-0 sudo[38472]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:01 compute-0 sudo[38595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orabzgpvieeeogwhnjtmhpcqurkwsxrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050581.1075883-355-13592546095707/AnsiballZ_copy.py'
Nov 25 06:03:01 compute-0 sudo[38595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:01 compute-0 python3.9[38597]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050581.1075883-355-13592546095707/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:03:01 compute-0 sudo[38595]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:02 compute-0 sudo[38747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mulamqgtimqrcpuicalicfbmcycffylz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050581.9456205-370-78983671399174/AnsiballZ_systemd.py'
Nov 25 06:03:02 compute-0 sudo[38747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:02 compute-0 python3.9[38749]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:03:02 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 25 06:03:02 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 25 06:03:02 compute-0 kernel: Bridge firewalling registered
Nov 25 06:03:02 compute-0 systemd-modules-load[38753]: Inserted module 'br_netfilter'
Nov 25 06:03:02 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 25 06:03:02 compute-0 sudo[38747]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:03 compute-0 sudo[38906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhcwfknwprjtvkjnwavslkkbnquxixoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050582.8812633-378-52595048591648/AnsiballZ_stat.py'
Nov 25 06:03:03 compute-0 sudo[38906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:03 compute-0 python3.9[38908]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:03:03 compute-0 sudo[38906]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:03 compute-0 sudo[39029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfsleuljjrqfjftrbeoskpshvdbeiqbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050582.8812633-378-52595048591648/AnsiballZ_copy.py'
Nov 25 06:03:03 compute-0 sudo[39029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:03 compute-0 python3.9[39031]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050582.8812633-378-52595048591648/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:03:03 compute-0 sudo[39029]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:04 compute-0 sudo[39181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqnwxuotdbkgzapondadymhqxfvgnntr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050583.8592951-396-200937631817543/AnsiballZ_dnf.py'
Nov 25 06:03:04 compute-0 sudo[39181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:04 compute-0 python3.9[39183]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:03:10 compute-0 dbus-broker-launch[713]: Noticed file-system modification, trigger reload.
Nov 25 06:03:10 compute-0 dbus-broker-launch[713]: Noticed file-system modification, trigger reload.
Nov 25 06:03:10 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 06:03:10 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 06:03:10 compute-0 systemd[1]: Reloading.
Nov 25 06:03:10 compute-0 systemd-rc-local-generator[39241]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:03:10 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 06:03:11 compute-0 sudo[39181]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:11 compute-0 python3.9[40419]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:03:12 compute-0 python3.9[41449]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 25 06:03:12 compute-0 python3.9[42272]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:03:13 compute-0 sudo[43267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unxvbzsqfzqzfszmhopscxtxdshlbrrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050592.9730227-435-212806330293812/AnsiballZ_command.py'
Nov 25 06:03:13 compute-0 sudo[43267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:13 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 06:03:13 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 06:03:13 compute-0 systemd[1]: man-db-cache-update.service: Consumed 3.261s CPU time.
Nov 25 06:03:13 compute-0 systemd[1]: run-r7a1fb2bd41e946caabb5a9f94ecdfdc8.service: Deactivated successfully.
Nov 25 06:03:13 compute-0 python3.9[43290]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:03:13 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 25 06:03:13 compute-0 systemd[1]: Starting Authorization Manager...
Nov 25 06:03:13 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 25 06:03:13 compute-0 polkitd[43570]: Started polkitd version 0.117
Nov 25 06:03:13 compute-0 polkitd[43570]: Loading rules from directory /etc/polkit-1/rules.d
Nov 25 06:03:13 compute-0 polkitd[43570]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 25 06:03:13 compute-0 polkitd[43570]: Finished loading, compiling and executing 2 rules
Nov 25 06:03:13 compute-0 systemd[1]: Started Authorization Manager.
Nov 25 06:03:13 compute-0 polkitd[43570]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 25 06:03:13 compute-0 sudo[43267]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:14 compute-0 sudo[43734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udzseoicdudhvltrwbyfbmhqsbqpcnyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050593.9594314-444-181920463255467/AnsiballZ_systemd.py'
Nov 25 06:03:14 compute-0 sudo[43734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:14 compute-0 python3.9[43736]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:03:14 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 25 06:03:14 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Nov 25 06:03:14 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 25 06:03:14 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 25 06:03:14 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 25 06:03:14 compute-0 sudo[43734]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:15 compute-0 python3.9[43898]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 25 06:03:16 compute-0 sudo[44048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psvjzmdrrunfafznhqkfdxsrzkhhifoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050596.6940682-501-89408073529506/AnsiballZ_systemd.py'
Nov 25 06:03:16 compute-0 sudo[44048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:17 compute-0 python3.9[44050]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:03:17 compute-0 systemd[1]: Reloading.
Nov 25 06:03:17 compute-0 systemd-rc-local-generator[44076]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:03:17 compute-0 sudo[44048]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:17 compute-0 sudo[44238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uytpgrzjeyxxpiaovvjoupohhloadszc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050597.4316392-501-193854282494065/AnsiballZ_systemd.py'
Nov 25 06:03:17 compute-0 sudo[44238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:17 compute-0 python3.9[44240]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:03:17 compute-0 systemd[1]: Reloading.
Nov 25 06:03:17 compute-0 systemd-rc-local-generator[44263]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:03:18 compute-0 sudo[44238]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:18 compute-0 sudo[44426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjvmtjiwxhjvgeqklelbftieokxphpct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050598.2353787-517-92766820581197/AnsiballZ_command.py'
Nov 25 06:03:18 compute-0 sudo[44426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:18 compute-0 python3.9[44428]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:03:18 compute-0 sudo[44426]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:18 compute-0 sudo[44579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrbeuivnyecitpjazumxlqkathpmknix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050598.6997862-525-140393965788835/AnsiballZ_command.py'
Nov 25 06:03:18 compute-0 sudo[44579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:19 compute-0 python3.9[44581]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:03:19 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 25 06:03:19 compute-0 sudo[44579]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:19 compute-0 sudo[44732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvksqioftifcrmocbludkuoigdyqvtpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050599.1515546-533-252062420112198/AnsiballZ_command.py'
Nov 25 06:03:19 compute-0 sudo[44732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:19 compute-0 python3.9[44734]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:03:20 compute-0 sudo[44732]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:20 compute-0 sudo[44894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjugpxphivyaifafyfzbzngdvuqqdczw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050600.6080055-541-220992725805675/AnsiballZ_command.py'
Nov 25 06:03:20 compute-0 sudo[44894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:20 compute-0 python3.9[44896]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:03:20 compute-0 sudo[44894]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:21 compute-0 sudo[45047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvotdwpjwzvokhlvmgmszqcojrncthsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050601.0499134-549-90433595840816/AnsiballZ_systemd.py'
Nov 25 06:03:21 compute-0 sudo[45047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:21 compute-0 python3.9[45049]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:03:21 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 25 06:03:21 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Nov 25 06:03:21 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Nov 25 06:03:21 compute-0 systemd[1]: Starting Apply Kernel Variables...
Nov 25 06:03:21 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 25 06:03:21 compute-0 systemd[1]: Finished Apply Kernel Variables.
Nov 25 06:03:21 compute-0 sudo[45047]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:21 compute-0 sshd-session[31440]: Connection closed by 192.168.122.30 port 46902
Nov 25 06:03:21 compute-0 sshd-session[31437]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:03:21 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Nov 25 06:03:21 compute-0 systemd[1]: session-9.scope: Consumed 1min 40.059s CPU time.
Nov 25 06:03:21 compute-0 systemd-logind[744]: Session 9 logged out. Waiting for processes to exit.
Nov 25 06:03:21 compute-0 systemd-logind[744]: Removed session 9.
Nov 25 06:03:26 compute-0 sshd-session[45080]: Accepted publickey for zuul from 192.168.122.30 port 44002 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:03:26 compute-0 systemd-logind[744]: New session 10 of user zuul.
Nov 25 06:03:26 compute-0 systemd[1]: Started Session 10 of User zuul.
Nov 25 06:03:26 compute-0 sshd-session[45080]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:03:27 compute-0 python3.9[45233]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:03:28 compute-0 python3.9[45387]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:03:29 compute-0 sudo[45541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evwshmadkclpmqdzrvdydpendnvpnzab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050609.0026925-50-183303257520493/AnsiballZ_command.py'
Nov 25 06:03:29 compute-0 sudo[45541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:29 compute-0 python3.9[45543]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:03:29 compute-0 sudo[45541]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:30 compute-0 python3.9[45694]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:03:30 compute-0 sudo[45848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lubeemdusnqgatesihibaynoesvsqekq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050610.4492304-70-93324485565019/AnsiballZ_setup.py'
Nov 25 06:03:30 compute-0 sudo[45848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:30 compute-0 python3.9[45850]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:03:31 compute-0 sudo[45848]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:31 compute-0 sudo[45932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urwrosbkgdkpkvxnfeabwbfvwlkvqjor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050610.4492304-70-93324485565019/AnsiballZ_dnf.py'
Nov 25 06:03:31 compute-0 sudo[45932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:31 compute-0 python3.9[45934]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:03:32 compute-0 sudo[45932]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:32 compute-0 sudo[46085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wndmormngkgwjeaaeqzqjgbxciqxcjgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050612.5724287-82-182041534124920/AnsiballZ_setup.py'
Nov 25 06:03:32 compute-0 sudo[46085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:32 compute-0 python3.9[46087]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:03:33 compute-0 sudo[46085]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:33 compute-0 sudo[46256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djbczojvieqwxtzjthifrytnjklqvonx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050613.2797995-93-232699991540319/AnsiballZ_file.py'
Nov 25 06:03:33 compute-0 sudo[46256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:33 compute-0 python3.9[46258]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:03:33 compute-0 sudo[46256]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:34 compute-0 sudo[46408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbwaffclglmjbcdukrigujqskzgggbcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050613.875271-101-20138919912481/AnsiballZ_command.py'
Nov 25 06:03:34 compute-0 sudo[46408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:34 compute-0 python3.9[46410]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:03:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1280126576-merged.mount: Deactivated successfully.
Nov 25 06:03:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1858346938-merged.mount: Deactivated successfully.
Nov 25 06:03:34 compute-0 podman[46411]: 2025-11-25 06:03:34.2435238 +0000 UTC m=+0.035951816 system refresh
Nov 25 06:03:34 compute-0 sudo[46408]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:34 compute-0 sudo[46569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bodbhzvkfpytqxudyviaibjuavsfeuat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050614.3796844-109-239676111572232/AnsiballZ_stat.py'
Nov 25 06:03:34 compute-0 sudo[46569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:34 compute-0 python3.9[46571]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:03:34 compute-0 sudo[46569]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:35 compute-0 sudo[46692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycytxbergdtcuvjisyebagqvdhqqfmsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050614.3796844-109-239676111572232/AnsiballZ_copy.py'
Nov 25 06:03:35 compute-0 sudo[46692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:03:35 compute-0 python3.9[46694]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050614.3796844-109-239676111572232/.source.json follow=False _original_basename=podman_network_config.j2 checksum=9265b5381ba320fc51282fd662858987fece8be6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:03:35 compute-0 sudo[46692]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:35 compute-0 sudo[46844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hivjchopgwqutsvdgctxviuvdgxoeifs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050615.4650424-124-21551342874974/AnsiballZ_stat.py'
Nov 25 06:03:35 compute-0 sudo[46844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:35 compute-0 python3.9[46846]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:03:35 compute-0 sudo[46844]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:36 compute-0 sudo[46967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bigfzlwnfhzhxzkuqehoepfvctrrrfty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050615.4650424-124-21551342874974/AnsiballZ_copy.py'
Nov 25 06:03:36 compute-0 sudo[46967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:36 compute-0 python3.9[46969]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050615.4650424-124-21551342874974/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:03:36 compute-0 sudo[46967]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:36 compute-0 sudo[47119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhdiyqpofnjlvgorpfzdjyhkxkxftyhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050616.333917-140-63799851172671/AnsiballZ_ini_file.py'
Nov 25 06:03:36 compute-0 sudo[47119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:36 compute-0 python3.9[47121]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:03:36 compute-0 sudo[47119]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:37 compute-0 sudo[47271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efvbvodbwdwindctoejqegextabpvnzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050616.907663-140-260414475905012/AnsiballZ_ini_file.py'
Nov 25 06:03:37 compute-0 sudo[47271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:37 compute-0 python3.9[47273]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:03:37 compute-0 sudo[47271]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:37 compute-0 sudo[47423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lchqbbfxprvtqadruoohasomflmiodpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050617.3279-140-67390209436264/AnsiballZ_ini_file.py'
Nov 25 06:03:37 compute-0 sudo[47423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:37 compute-0 python3.9[47425]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:03:37 compute-0 sudo[47423]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:37 compute-0 sudo[47575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zowdahlluvxsuzlrbodgobvkbgfoztwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050617.7436523-140-123700640435142/AnsiballZ_ini_file.py'
Nov 25 06:03:37 compute-0 sudo[47575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:38 compute-0 python3.9[47577]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:03:38 compute-0 sudo[47575]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:38 compute-0 python3.9[47727]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:03:39 compute-0 sudo[47879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zakovtabfsvglmggxldsikmfmlchoffs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050618.8635507-180-53352425640320/AnsiballZ_dnf.py'
Nov 25 06:03:39 compute-0 sudo[47879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:39 compute-0 python3.9[47881]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 06:03:40 compute-0 sudo[47879]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:40 compute-0 sudo[48032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lymkkxkbeinnjxgxrflbdcbffbbfuheb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050620.290052-188-245605920885518/AnsiballZ_dnf.py'
Nov 25 06:03:40 compute-0 sudo[48032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:40 compute-0 python3.9[48034]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 06:03:43 compute-0 sudo[48032]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:43 compute-0 sudo[48192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoigiygykfmdwrcdjyjkrweplvlwzrgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050623.728663-198-260915995129724/AnsiballZ_dnf.py'
Nov 25 06:03:43 compute-0 sudo[48192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:44 compute-0 python3.9[48194]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 06:03:45 compute-0 sudo[48192]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:45 compute-0 sudo[48345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spcieujyizyayturpjbgweuscapjjdkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050625.2332578-207-226836507232271/AnsiballZ_dnf.py'
Nov 25 06:03:45 compute-0 sudo[48345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:45 compute-0 python3.9[48347]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 06:03:46 compute-0 sudo[48345]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:46 compute-0 sudo[48498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdcaqtotsqtsvczzexssdjfvhbgxscxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050626.8283799-218-91250814877971/AnsiballZ_dnf.py'
Nov 25 06:03:47 compute-0 sudo[48498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:47 compute-0 python3.9[48500]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 06:03:49 compute-0 sudo[48498]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:49 compute-0 sudo[48654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzpxyqfdzjydrbptrfuwbohtcluolscj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050629.791274-226-244172154588332/AnsiballZ_dnf.py'
Nov 25 06:03:49 compute-0 sudo[48654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:50 compute-0 python3.9[48656]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 06:03:58 compute-0 sudo[48654]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:58 compute-0 sudo[48824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwqibbuvoetxsuueqfrrmpkszfskdbdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050638.2233448-235-242237338088328/AnsiballZ_dnf.py'
Nov 25 06:03:58 compute-0 sudo[48824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:03:58 compute-0 python3.9[48826]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 06:03:59 compute-0 sudo[48824]: pam_unix(sudo:session): session closed for user root
Nov 25 06:03:59 compute-0 sudo[48977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjqopihwfzpjfplypshwgqpsbadjiltb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050639.6822155-244-196271188431627/AnsiballZ_dnf.py'
Nov 25 06:03:59 compute-0 sudo[48977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:04:00 compute-0 python3.9[48979]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 06:04:14 compute-0 sudo[48977]: pam_unix(sudo:session): session closed for user root
Nov 25 06:04:15 compute-0 sudo[49314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feqeeebhlmadfqnipjxxaihaizkrglle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050654.9373412-253-89604447491235/AnsiballZ_dnf.py'
Nov 25 06:04:15 compute-0 sudo[49314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:04:15 compute-0 python3.9[49316]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 06:04:16 compute-0 sudo[49314]: pam_unix(sudo:session): session closed for user root
Nov 25 06:04:16 compute-0 sudo[49470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjzmokrkcfwpwvqxqhdikqscsigteazb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050656.5563076-264-13500478324248/AnsiballZ_file.py'
Nov 25 06:04:16 compute-0 sudo[49470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:04:16 compute-0 python3.9[49472]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:04:16 compute-0 sudo[49470]: pam_unix(sudo:session): session closed for user root
Nov 25 06:04:17 compute-0 sudo[49645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvdxuyzrgbgfqhjzknsbdpstxyiqzelc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050657.0406082-272-269391266505817/AnsiballZ_stat.py'
Nov 25 06:04:17 compute-0 sudo[49645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:04:17 compute-0 python3.9[49647]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:04:17 compute-0 sudo[49645]: pam_unix(sudo:session): session closed for user root
Nov 25 06:04:17 compute-0 sudo[49768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puaocczcxqkoilnzbpzvbdxkrpqjvcid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050657.0406082-272-269391266505817/AnsiballZ_copy.py'
Nov 25 06:04:17 compute-0 sudo[49768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:04:17 compute-0 python3.9[49770]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764050657.0406082-272-269391266505817/.source.json _original_basename=.a85alhii follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:04:17 compute-0 sudo[49768]: pam_unix(sudo:session): session closed for user root
Nov 25 06:04:18 compute-0 sudo[49920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqecjyrndeqryshdferzeoxippkxzcqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050658.0018404-290-103073594641440/AnsiballZ_podman_image.py'
Nov 25 06:04:18 compute-0 sudo[49920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:04:18 compute-0 python3.9[49922]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 06:04:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2257547525-merged.mount: Deactivated successfully.
Nov 25 06:04:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2257547525-lower\x2dmapped.mount: Deactivated successfully.
Nov 25 06:04:27 compute-0 podman[49933]: 2025-11-25 06:04:27.776277185 +0000 UTC m=+9.216598439 image pull fb385c849c98a3c678a3d627f4cb894eda21a9dce6ba3cc1ef408e332ab6bee7 quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:04:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:27 compute-0 sudo[49920]: pam_unix(sudo:session): session closed for user root
Nov 25 06:04:28 compute-0 sudo[50199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctfkoewojpnlzzbsewgpodkspzhnrbsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050668.1717985-301-1144136124500/AnsiballZ_podman_image.py'
Nov 25 06:04:28 compute-0 sudo[50199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:04:28 compute-0 python3.9[50201]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 06:04:39 compute-0 podman[50211]: 2025-11-25 06:04:39.581776808 +0000 UTC m=+11.037550434 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:04:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:39 compute-0 sudo[50199]: pam_unix(sudo:session): session closed for user root
Nov 25 06:04:40 compute-0 sudo[50503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjigftnwjvbnlrnupiyulxfshwnwekwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050679.9282777-311-72937687709618/AnsiballZ_podman_image.py'
Nov 25 06:04:40 compute-0 sudo[50503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:04:40 compute-0 python3.9[50505]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 06:04:41 compute-0 podman[50515]: 2025-11-25 06:04:41.982686863 +0000 UTC m=+1.674947292 image pull 828f38556716c2bbf53d759883b37dd33dbb0b3669db0223d51c04787010a74b quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:04:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:42 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:42 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:42 compute-0 sudo[50503]: pam_unix(sudo:session): session closed for user root
Nov 25 06:04:42 compute-0 sudo[50729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuzinbaqdevpnoxotknggahxmyxkhfpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050682.3171139-320-91305464721063/AnsiballZ_podman_image.py'
Nov 25 06:04:42 compute-0 sudo[50729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:04:42 compute-0 python3.9[50731]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 06:04:57 compute-0 podman[50741]: 2025-11-25 06:04:57.588672402 +0000 UTC m=+14.892404214 image pull b5a49b8af9b6d4308f9036b8ada850f2911f350781c3ddf60dd55cecb3543ff2 quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:04:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:04:57 compute-0 sudo[50729]: pam_unix(sudo:session): session closed for user root
Nov 25 06:04:58 compute-0 sudo[50974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sguulhhgvutwrdpgeddzkiblyqaobklv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050698.047728-331-227642156370346/AnsiballZ_podman_image.py'
Nov 25 06:04:58 compute-0 sudo[50974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:04:58 compute-0 python3.9[50976]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 06:05:03 compute-0 podman[50986]: 2025-11-25 06:05:03.670471038 +0000 UTC m=+5.247219902 image pull 884992811c8175ee05276a13464176221fd628ef0f4b26c22d3021b5f1aa08da quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:05:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:05:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:05:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:05:03 compute-0 sudo[50974]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:04 compute-0 sudo[51217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thcogmdgpfozgvkwqnrepjgdqffwrvca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050703.9021642-331-17427495374302/AnsiballZ_podman_image.py'
Nov 25 06:05:04 compute-0 sudo[51217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:04 compute-0 python3.9[51219]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 06:05:06 compute-0 podman[51230]: 2025-11-25 06:05:06.048213428 +0000 UTC m=+1.791264491 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Nov 25 06:05:06 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:05:06 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:05:06 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:05:06 compute-0 sudo[51217]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:06 compute-0 sshd-session[45083]: Connection closed by 192.168.122.30 port 44002
Nov 25 06:05:06 compute-0 sshd-session[45080]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:05:06 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Nov 25 06:05:06 compute-0 systemd[1]: session-10.scope: Consumed 1min 34.565s CPU time.
Nov 25 06:05:06 compute-0 systemd-logind[744]: Session 10 logged out. Waiting for processes to exit.
Nov 25 06:05:06 compute-0 systemd-logind[744]: Removed session 10.
Nov 25 06:05:12 compute-0 sshd-session[51351]: Accepted publickey for zuul from 192.168.122.30 port 45108 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:05:12 compute-0 systemd-logind[744]: New session 11 of user zuul.
Nov 25 06:05:12 compute-0 systemd[1]: Started Session 11 of User zuul.
Nov 25 06:05:12 compute-0 sshd-session[51351]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:05:12 compute-0 python3.9[51504]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:05:13 compute-0 sudo[51658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhaprmmyeukbtnvwzlwzgwgnfoxlywxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050713.1929197-36-171078205065984/AnsiballZ_getent.py'
Nov 25 06:05:13 compute-0 sudo[51658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:13 compute-0 python3.9[51660]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 25 06:05:13 compute-0 sudo[51658]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:14 compute-0 sudo[51811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfscxnonzdcrrmfritikqreedfujdtve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050713.7531374-44-201498985090378/AnsiballZ_group.py'
Nov 25 06:05:14 compute-0 sudo[51811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:14 compute-0 python3.9[51813]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 06:05:14 compute-0 groupadd[51814]: group added to /etc/group: name=openvswitch, GID=42476
Nov 25 06:05:14 compute-0 groupadd[51814]: group added to /etc/gshadow: name=openvswitch
Nov 25 06:05:14 compute-0 groupadd[51814]: new group: name=openvswitch, GID=42476
Nov 25 06:05:14 compute-0 sudo[51811]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:14 compute-0 sudo[51969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qygznvcshmxritjdiiidxlukqwazxqtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050714.3304708-52-23280397285466/AnsiballZ_user.py'
Nov 25 06:05:14 compute-0 sudo[51969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:14 compute-0 python3.9[51971]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 06:05:14 compute-0 useradd[51973]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Nov 25 06:05:14 compute-0 useradd[51973]: add 'openvswitch' to group 'hugetlbfs'
Nov 25 06:05:14 compute-0 useradd[51973]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 25 06:05:14 compute-0 sudo[51969]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:15 compute-0 sudo[52129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-augokgrrzdousxcnakhipellfseowymk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050715.0306003-62-138786581359456/AnsiballZ_setup.py'
Nov 25 06:05:15 compute-0 sudo[52129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:15 compute-0 python3.9[52131]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:05:15 compute-0 sudo[52129]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:15 compute-0 sudo[52213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zclrdxcvwxitbrlmaeuaihqomsrkxrmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050715.0306003-62-138786581359456/AnsiballZ_dnf.py'
Nov 25 06:05:15 compute-0 sudo[52213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:16 compute-0 python3.9[52215]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 06:05:17 compute-0 sudo[52213]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:18 compute-0 sudo[52375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmglitddwsksqybycatfiuannxtxpdms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050718.016463-76-82493009482115/AnsiballZ_dnf.py'
Nov 25 06:05:18 compute-0 sudo[52375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:18 compute-0 python3.9[52377]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:05:26 compute-0 kernel: SELinux:  Converting 2730 SID table entries...
Nov 25 06:05:26 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 06:05:26 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 25 06:05:26 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 06:05:26 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 25 06:05:26 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 06:05:26 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 06:05:26 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 06:05:26 compute-0 groupadd[52400]: group added to /etc/group: name=unbound, GID=993
Nov 25 06:05:26 compute-0 groupadd[52400]: group added to /etc/gshadow: name=unbound
Nov 25 06:05:26 compute-0 groupadd[52400]: new group: name=unbound, GID=993
Nov 25 06:05:26 compute-0 useradd[52407]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Nov 25 06:05:26 compute-0 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 25 06:05:26 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 25 06:05:27 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 06:05:27 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 06:05:27 compute-0 systemd[1]: Reloading.
Nov 25 06:05:27 compute-0 systemd-sysv-generator[52904]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:05:27 compute-0 systemd-rc-local-generator[52898]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:05:27 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 06:05:28 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 06:05:28 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 06:05:28 compute-0 systemd[1]: run-r5f10bd0da92d4026a76450a54a16ed67.service: Deactivated successfully.
Nov 25 06:05:28 compute-0 sudo[52375]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:28 compute-0 sudo[53473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrydpfbrdmorexvzrdenjxxsguqkbsmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050728.3407996-84-227142088160581/AnsiballZ_systemd.py'
Nov 25 06:05:28 compute-0 sudo[53473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:28 compute-0 python3.9[53475]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 06:05:29 compute-0 systemd[1]: Reloading.
Nov 25 06:05:29 compute-0 systemd-sysv-generator[53507]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:05:29 compute-0 systemd-rc-local-generator[53504]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:05:29 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Nov 25 06:05:29 compute-0 chown[53517]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 25 06:05:29 compute-0 ovs-ctl[53522]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 25 06:05:29 compute-0 ovs-ctl[53522]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 25 06:05:29 compute-0 ovs-ctl[53522]: Starting ovsdb-server [  OK  ]
Nov 25 06:05:29 compute-0 ovs-vsctl[53571]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 25 06:05:29 compute-0 ovs-vsctl[53591]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"afd6e104-36fa-47e5-ae59-019941e8d117\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 25 06:05:29 compute-0 ovs-ctl[53522]: Configuring Open vSwitch system IDs [  OK  ]
Nov 25 06:05:29 compute-0 ovs-ctl[53522]: Enabling remote OVSDB managers [  OK  ]
Nov 25 06:05:29 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Nov 25 06:05:29 compute-0 ovs-vsctl[53597]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 25 06:05:29 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 25 06:05:29 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 25 06:05:29 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 25 06:05:29 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Nov 25 06:05:29 compute-0 ovs-ctl[53641]: Inserting openvswitch module [  OK  ]
Nov 25 06:05:29 compute-0 ovs-ctl[53610]: Starting ovs-vswitchd [  OK  ]
Nov 25 06:05:29 compute-0 ovs-ctl[53610]: Enabling remote OVSDB managers [  OK  ]
Nov 25 06:05:29 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 25 06:05:29 compute-0 ovs-vsctl[53659]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 25 06:05:29 compute-0 systemd[1]: Starting Open vSwitch...
Nov 25 06:05:29 compute-0 systemd[1]: Finished Open vSwitch.
Nov 25 06:05:29 compute-0 sudo[53473]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:30 compute-0 python3.9[53810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:05:30 compute-0 sudo[53960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtiwbtkbroujnqzjbcwxgyydrevragvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050730.2984133-102-49701614676289/AnsiballZ_sefcontext.py'
Nov 25 06:05:30 compute-0 sudo[53960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:30 compute-0 python3.9[53962]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 25 06:05:31 compute-0 kernel: SELinux:  Converting 2744 SID table entries...
Nov 25 06:05:31 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 06:05:31 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 25 06:05:31 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 06:05:31 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 25 06:05:31 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 06:05:31 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 06:05:31 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 06:05:31 compute-0 sudo[53960]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:32 compute-0 python3.9[54117]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:05:32 compute-0 sudo[54273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylzubfaihvirpdyhoveoxyjfpowinzaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050732.5196998-120-109381742707073/AnsiballZ_dnf.py'
Nov 25 06:05:32 compute-0 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 25 06:05:32 compute-0 sudo[54273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:32 compute-0 python3.9[54275]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:05:33 compute-0 sudo[54273]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:34 compute-0 sudo[54426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogsaqrhderplohetuodgghoraedagehr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050733.935404-128-134306923887218/AnsiballZ_command.py'
Nov 25 06:05:34 compute-0 sudo[54426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:34 compute-0 python3.9[54428]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:05:34 compute-0 sudo[54426]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:35 compute-0 sudo[54713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwnvfsujkcezbdagdfqcdljkcwpmqkwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050735.0043855-136-193838638788944/AnsiballZ_file.py'
Nov 25 06:05:35 compute-0 sudo[54713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:35 compute-0 python3.9[54715]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 25 06:05:35 compute-0 sudo[54713]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:35 compute-0 python3.9[54865]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:05:36 compute-0 sudo[55017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlalxacicylgbfdzltobepbhkvtegcdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050736.0750046-152-19146674242665/AnsiballZ_dnf.py'
Nov 25 06:05:36 compute-0 sudo[55017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:36 compute-0 python3.9[55019]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:05:39 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 06:05:39 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 06:05:39 compute-0 systemd[1]: Reloading.
Nov 25 06:05:39 compute-0 systemd-rc-local-generator[55057]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:05:39 compute-0 systemd-sysv-generator[55060]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:05:39 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 06:05:39 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 06:05:39 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 06:05:39 compute-0 systemd[1]: run-r1c91eadc4a0f48a7b9d185cd092f9a64.service: Deactivated successfully.
Nov 25 06:05:39 compute-0 sudo[55017]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:39 compute-0 sudo[55334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjdiuhjnyfskqfdudwumnzgfuddridsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050739.6862326-160-279518100898178/AnsiballZ_systemd.py'
Nov 25 06:05:39 compute-0 sudo[55334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:40 compute-0 python3.9[55336]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:05:40 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 25 06:05:40 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Nov 25 06:05:40 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Nov 25 06:05:40 compute-0 NetworkManager[7248]: <info>  [1764050740.1106] caught SIGTERM, shutting down normally.
Nov 25 06:05:40 compute-0 systemd[1]: Stopping Network Manager...
Nov 25 06:05:40 compute-0 NetworkManager[7248]: <info>  [1764050740.1113] dhcp4 (eth0): canceled DHCP transaction
Nov 25 06:05:40 compute-0 NetworkManager[7248]: <info>  [1764050740.1114] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 06:05:40 compute-0 NetworkManager[7248]: <info>  [1764050740.1114] dhcp4 (eth0): state changed no lease
Nov 25 06:05:40 compute-0 NetworkManager[7248]: <info>  [1764050740.1115] dhcp6 (eth0): canceled DHCP transaction
Nov 25 06:05:40 compute-0 NetworkManager[7248]: <info>  [1764050740.1115] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 06:05:40 compute-0 NetworkManager[7248]: <info>  [1764050740.1115] dhcp6 (eth0): state changed no lease
Nov 25 06:05:40 compute-0 NetworkManager[7248]: <info>  [1764050740.1116] manager: NetworkManager state is now CONNECTED_SITE
Nov 25 06:05:40 compute-0 NetworkManager[7248]: <info>  [1764050740.1143] exiting (success)
Nov 25 06:05:40 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 06:05:40 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 06:05:40 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 25 06:05:40 compute-0 systemd[1]: Stopped Network Manager.
Nov 25 06:05:40 compute-0 systemd[1]: Starting Network Manager...
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.1590] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:724c5288-952e-4df2-81c9-c4395c2e16fd)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.1591] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.1629] manager[0x56045d058090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 25 06:05:40 compute-0 systemd[1]: Starting Hostname Service...
Nov 25 06:05:40 compute-0 systemd[1]: Started Hostname Service.
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2167] hostname: hostname: using hostnamed
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2168] hostname: static hostname changed from (none) to "compute-0"
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2170] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2172] manager[0x56045d058090]: rfkill: Wi-Fi hardware radio set enabled
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2173] manager[0x56045d058090]: rfkill: WWAN hardware radio set enabled
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2186] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2192] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2192] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2193] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2193] manager: Networking is enabled by state file
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2194] settings: Loaded settings plugin: keyfile (internal)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2197] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2213] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2218] dhcp: init: Using DHCP client 'internal'
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2219] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2222] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2225] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2229] device (lo): Activation: starting connection 'lo' (38cce539-3d4c-4266-b4bb-4a3c7b88c026)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2233] device (eth0): carrier: link connected
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2236] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2239] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2239] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2242] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2246] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2249] device (eth1): carrier: link connected
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2252] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2254] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (4c59c9f9-c07e-57f4-9758-5e85b4fcf2c4) (indicated)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2255] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2258] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2261] device (eth1): Activation: starting connection 'ci-private-network' (4c59c9f9-c07e-57f4-9758-5e85b4fcf2c4)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2265] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 25 06:05:40 compute-0 systemd[1]: Started Network Manager.
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2268] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2270] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2271] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2278] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2279] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2281] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2282] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2283] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2287] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2288] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2290] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2294] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2297] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2301] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2316] dhcp4 (eth0): state changed new lease, address=192.168.26.115
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2322] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 25 06:05:40 compute-0 systemd[1]: Starting Network Manager Wait Online...
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2351] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2352] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2353] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2357] device (lo): Activation: successful, device activated.
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2359] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2361] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 25 06:05:40 compute-0 NetworkManager[55345]: <info>  [1764050740.2363] device (eth1): Activation: successful, device activated.
Nov 25 06:05:40 compute-0 sudo[55334]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:40 compute-0 sudo[55543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgrhkreyzycrhvrlcnfoleczprsiwpbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050740.3800266-168-8129432181622/AnsiballZ_dnf.py'
Nov 25 06:05:40 compute-0 sudo[55543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:40 compute-0 python3.9[55545]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:05:41 compute-0 NetworkManager[55345]: <info>  [1764050741.2975] dhcp6 (eth0): state changed new lease, address=2001:db8::331
Nov 25 06:05:41 compute-0 NetworkManager[55345]: <info>  [1764050741.2987] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 25 06:05:41 compute-0 NetworkManager[55345]: <info>  [1764050741.3011] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 25 06:05:41 compute-0 NetworkManager[55345]: <info>  [1764050741.3012] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 25 06:05:41 compute-0 NetworkManager[55345]: <info>  [1764050741.3015] manager: NetworkManager state is now CONNECTED_SITE
Nov 25 06:05:41 compute-0 NetworkManager[55345]: <info>  [1764050741.3017] device (eth0): Activation: successful, device activated.
Nov 25 06:05:41 compute-0 NetworkManager[55345]: <info>  [1764050741.3019] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 25 06:05:41 compute-0 NetworkManager[55345]: <info>  [1764050741.3021] manager: startup complete
Nov 25 06:05:41 compute-0 systemd[1]: Finished Network Manager Wait Online.
Nov 25 06:05:46 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 06:05:46 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 06:05:46 compute-0 systemd[1]: Reloading.
Nov 25 06:05:46 compute-0 systemd-rc-local-generator[55610]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:05:46 compute-0 systemd-sysv-generator[55613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:05:46 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 06:05:46 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 06:05:46 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 06:05:46 compute-0 systemd[1]: run-r7e14ebf9888943689d37a52f03da7b81.service: Deactivated successfully.
Nov 25 06:05:47 compute-0 sudo[55543]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:47 compute-0 sudo[56021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvgmxsprimywurwqahhcsxxcexsgsdtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050747.3822525-180-59866788767556/AnsiballZ_stat.py'
Nov 25 06:05:47 compute-0 sudo[56021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:47 compute-0 python3.9[56023]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:05:47 compute-0 sudo[56021]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:48 compute-0 sudo[56173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbrttgtmknruxbbqaiocrgqrxgxrednk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050747.846397-189-69823003338404/AnsiballZ_ini_file.py'
Nov 25 06:05:48 compute-0 sudo[56173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:48 compute-0 python3.9[56175]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:05:48 compute-0 sudo[56173]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:48 compute-0 sudo[56327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkgsijcdbjvfmegstksmreqlvwhepxbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050748.5067492-199-234071984510016/AnsiballZ_ini_file.py'
Nov 25 06:05:48 compute-0 sudo[56327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:48 compute-0 python3.9[56329]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:05:48 compute-0 sudo[56327]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:49 compute-0 sudo[56479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywqdsgvlnbqyqrxfwxsbnyphchvexrez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050748.947506-199-203422443881959/AnsiballZ_ini_file.py'
Nov 25 06:05:49 compute-0 sudo[56479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:49 compute-0 python3.9[56481]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:05:49 compute-0 sudo[56479]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:49 compute-0 sudo[56633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwafhowlptoplohlcmgfffofvvddzuah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050749.4229858-214-57379835727795/AnsiballZ_ini_file.py'
Nov 25 06:05:49 compute-0 sudo[56633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:49 compute-0 python3.9[56635]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:05:49 compute-0 sudo[56633]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:50 compute-0 sudo[56785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngpuohaxbrosmgigahrdcskifhudzjxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050749.8516743-214-105279268795476/AnsiballZ_ini_file.py'
Nov 25 06:05:50 compute-0 sudo[56785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:50 compute-0 python3.9[56787]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:05:50 compute-0 sudo[56785]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:50 compute-0 sudo[56937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aabscimkvdeyyglsckigxyyrhjotdwrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050750.3116531-229-822490658964/AnsiballZ_stat.py'
Nov 25 06:05:50 compute-0 sudo[56937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:50 compute-0 python3.9[56939]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:05:50 compute-0 sudo[56937]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:51 compute-0 sudo[57060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfwnynsakhkodjdpkguttmehzxoiwnur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050750.3116531-229-822490658964/AnsiballZ_copy.py'
Nov 25 06:05:51 compute-0 sudo[57060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:51 compute-0 python3.9[57062]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764050750.3116531-229-822490658964/.source _original_basename=.uqmjith3 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:05:51 compute-0 sudo[57060]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:51 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 06:05:51 compute-0 sudo[57212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzppyngltvnsrpetkbftshqmpcdsfyje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050751.3178282-244-59620120037318/AnsiballZ_file.py'
Nov 25 06:05:51 compute-0 sudo[57212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:51 compute-0 python3.9[57214]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:05:51 compute-0 sudo[57212]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:52 compute-0 sudo[57364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwxtvxdcqciapitmyjwldikbzrkzdoex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050751.781336-252-150735170677931/AnsiballZ_edpm_os_net_config_mappings.py'
Nov 25 06:05:52 compute-0 sudo[57364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:52 compute-0 python3.9[57366]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 25 06:05:52 compute-0 sudo[57364]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:52 compute-0 sudo[57516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efxdjjrwkeuddtnwweowskhznndmdsvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050752.4609582-261-74376877902204/AnsiballZ_file.py'
Nov 25 06:05:52 compute-0 sudo[57516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:52 compute-0 python3.9[57518]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:05:52 compute-0 sudo[57516]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:53 compute-0 sudo[57668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omvvfhvqwelrpajjzqjmcupilmykyfom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050753.0711582-271-266625440919995/AnsiballZ_stat.py'
Nov 25 06:05:53 compute-0 sudo[57668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:53 compute-0 sudo[57668]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:53 compute-0 sudo[57791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyvvasiidzlzhurczfscklykozhelnlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050753.0711582-271-266625440919995/AnsiballZ_copy.py'
Nov 25 06:05:53 compute-0 sudo[57791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:53 compute-0 sudo[57791]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:54 compute-0 sudo[57943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jliygwzxaubtfsgjfuapitallyczsehz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050753.9039736-286-114214312052957/AnsiballZ_slurp.py'
Nov 25 06:05:54 compute-0 sudo[57943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:54 compute-0 python3.9[57945]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 25 06:05:54 compute-0 sudo[57943]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:55 compute-0 sudo[58118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzmpevzsnddryjmxcpfjwtojepyyalpz ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050754.5137143-295-278046977052350/async_wrapper.py j953753991117 300 /home/zuul/.ansible/tmp/ansible-tmp-1764050754.5137143-295-278046977052350/AnsiballZ_edpm_os_net_config.py _'
Nov 25 06:05:55 compute-0 sudo[58118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:55 compute-0 ansible-async_wrapper.py[58120]: Invoked with j953753991117 300 /home/zuul/.ansible/tmp/ansible-tmp-1764050754.5137143-295-278046977052350/AnsiballZ_edpm_os_net_config.py _
Nov 25 06:05:55 compute-0 ansible-async_wrapper.py[58123]: Starting module and watcher
Nov 25 06:05:55 compute-0 ansible-async_wrapper.py[58123]: Start watching 58124 (300)
Nov 25 06:05:55 compute-0 ansible-async_wrapper.py[58124]: Start module (58124)
Nov 25 06:05:55 compute-0 ansible-async_wrapper.py[58120]: Return async_wrapper task started.
Nov 25 06:05:55 compute-0 sudo[58118]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:55 compute-0 python3.9[58125]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 25 06:05:55 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 25 06:05:55 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 25 06:05:55 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 25 06:05:55 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 25 06:05:55 compute-0 kernel: cfg80211: failed to load regulatory.db
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5489] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5507] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5899] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5900] audit: op="connection-add" uuid="32969572-36bd-4e09-a231-0c0330c07640" name="br-ex-br" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5912] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5913] audit: op="connection-add" uuid="dd52bf4a-9a41-44b8-bd95-ac5af1eae4a8" name="br-ex-port" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5923] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5925] audit: op="connection-add" uuid="4504bb58-ccb6-4344-8578-4be4587b7fc1" name="eth1-port" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5934] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5935] audit: op="connection-add" uuid="44fdae4a-4736-4c0c-80c4-5a95fd31f7b7" name="vlan20-port" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5944] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5946] audit: op="connection-add" uuid="7ff9e859-6a08-4b65-9820-2e6b7dd273c7" name="vlan21-port" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5955] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5957] audit: op="connection-add" uuid="bb0b0201-506b-4842-be3b-97470244dc47" name="vlan22-port" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5973] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,connection.timestamp,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.may-fail,ipv6.routes,ipv6.method" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5986] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.5987] audit: op="connection-add" uuid="75ffb9af-127e-4c26-a2cc-73c03e8436ec" name="br-ex-if" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6008] audit: op="connection-update" uuid="4c59c9f9-c07e-57f4-9758-5e85b4fcf2c4" name="ci-private-network" args="ovs-external-ids.data,ovs-interface.type,ipv4.dns,ipv4.addresses,ipv4.never-default,ipv4.routes,ipv4.method,ipv4.routing-rules,connection.port-type,connection.slave-type,connection.timestamp,connection.master,connection.controller,ipv6.dns,ipv6.addresses,ipv6.addr-gen-mode,ipv6.routes,ipv6.method,ipv6.routing-rules" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6021] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6022] audit: op="connection-add" uuid="8b7e3857-ac1e-4a6c-a240-c461426bce7e" name="vlan20-if" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6034] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6035] audit: op="connection-add" uuid="7655cf78-ce74-4346-a7a3-cc4fe5f3cfb8" name="vlan21-if" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6047] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6048] audit: op="connection-add" uuid="674632a9-788e-434e-a706-5cdb2a5f74dd" name="vlan22-if" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6057] audit: op="connection-delete" uuid="a2275460-ad3e-3060-b256-d912ae2c7b1b" name="Wired connection 1" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6065] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6071] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6075] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (32969572-36bd-4e09-a231-0c0330c07640)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6075] audit: op="connection-activate" uuid="32969572-36bd-4e09-a231-0c0330c07640" name="br-ex-br" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6077] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6081] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6085] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (dd52bf4a-9a41-44b8-bd95-ac5af1eae4a8)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6086] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6090] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6093] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (4504bb58-ccb6-4344-8578-4be4587b7fc1)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6094] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6099] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6102] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (44fdae4a-4736-4c0c-80c4-5a95fd31f7b7)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6103] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6110] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6113] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (7ff9e859-6a08-4b65-9820-2e6b7dd273c7)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6115] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6119] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6123] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (bb0b0201-506b-4842-be3b-97470244dc47)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6124] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6126] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6127] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6132] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6136] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6139] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (75ffb9af-127e-4c26-a2cc-73c03e8436ec)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6140] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6142] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6144] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6145] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6146] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6158] device (eth1): disconnecting for new activation request.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6158] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6162] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6163] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6164] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6165] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6168] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6170] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (8b7e3857-ac1e-4a6c-a240-c461426bce7e)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6170] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6172] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6173] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6174] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6176] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6178] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6180] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (7655cf78-ce74-4346-a7a3-cc4fe5f3cfb8)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6181] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6182] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6183] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6184] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6185] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6188] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6190] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (674632a9-788e-434e-a706-5cdb2a5f74dd)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6190] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6192] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6193] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6194] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6195] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6205] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,ipv6.addr-gen-mode,ipv6.may-fail,ipv6.routes,ipv6.method" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6206] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6209] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6210] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6215] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6218] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 kernel: ovs-system: entered promiscuous mode
Nov 25 06:05:56 compute-0 kernel: Timeout policy base is empty
Nov 25 06:05:56 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 06:05:56 compute-0 systemd-udevd[58130]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6292] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6294] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6295] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6298] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6300] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6302] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6303] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6306] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6308] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6310] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6311] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6314] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6316] dhcp4 (eth0): canceled DHCP transaction
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6316] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6316] dhcp4 (eth0): state changed no lease
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6316] dhcp6 (eth0): canceled DHCP transaction
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6317] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6317] dhcp6 (eth0): state changed no lease
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6319] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 25 06:05:56 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6370] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6373] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58126 uid=0 result="fail" reason="Device is not activated"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6380] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6450] device (eth1): Activation: starting connection 'ci-private-network' (4c59c9f9-c07e-57f4-9758-5e85b4fcf2c4)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6453] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6455] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6458] dhcp4 (eth0): state changed new lease, address=192.168.26.115
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6468] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6470] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6475] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6478] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 25 06:05:56 compute-0 kernel: br-ex: entered promiscuous mode
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6509] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6510] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6511] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6512] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6513] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6516] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6524] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6526] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6534] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6536] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6539] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6542] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6544] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6547] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6549] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6552] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6557] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6561] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6567] device (eth1): state change: ip-config -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6568] device (eth1)[Open vSwitch Port]: detaching ovs interface eth1
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6568] device (eth1): released from controller device eth1
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6573] device (eth1): disconnecting for new activation request.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6574] audit: op="connection-activate" uuid="4c59c9f9-c07e-57f4-9758-5e85b4fcf2c4" name="ci-private-network" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 kernel: vlan22: entered promiscuous mode
Nov 25 06:05:56 compute-0 systemd-udevd[58131]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:05:56 compute-0 kernel: vlan21: entered promiscuous mode
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6654] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58126 uid=0 result="success"
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6657] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6657] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6664] device (eth1): Activation: starting connection 'ci-private-network' (4c59c9f9-c07e-57f4-9758-5e85b4fcf2c4)
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6675] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6677] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6686] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6694] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6699] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6704] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6719] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6727] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6728] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6729] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6734] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6740] device (eth1): Activation: successful, device activated.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6744] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 kernel: vlan20: entered promiscuous mode
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6765] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6773] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6775] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6779] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6786] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6793] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6837] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6843] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6853] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6896] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6914] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6933] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6937] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 06:05:56 compute-0 NetworkManager[55345]: <info>  [1764050756.6947] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 25 06:05:57 compute-0 NetworkManager[55345]: <info>  [1764050757.7862] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58126 uid=0 result="success"
Nov 25 06:05:57 compute-0 NetworkManager[55345]: <info>  [1764050757.8808] checkpoint[0x56045d030950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 25 06:05:57 compute-0 NetworkManager[55345]: <info>  [1764050757.8809] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58126 uid=0 result="success"
Nov 25 06:05:57 compute-0 NetworkManager[55345]: <info>  [1764050757.9784] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58126 uid=0 result="success"
Nov 25 06:05:57 compute-0 NetworkManager[55345]: <info>  [1764050757.9794] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58126 uid=0 result="success"
Nov 25 06:05:58 compute-0 NetworkManager[55345]: <info>  [1764050758.1193] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58126 uid=0 result="success"
Nov 25 06:05:58 compute-0 NetworkManager[55345]: <info>  [1764050758.2252] checkpoint[0x56045d030a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 25 06:05:58 compute-0 NetworkManager[55345]: <info>  [1764050758.2255] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58126 uid=0 result="success"
Nov 25 06:05:58 compute-0 NetworkManager[55345]: <info>  [1764050758.4295] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=58126 uid=0 result="success"
Nov 25 06:05:58 compute-0 NetworkManager[55345]: <info>  [1764050758.4306] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=58126 uid=0 result="success"
Nov 25 06:05:58 compute-0 NetworkManager[55345]: <info>  [1764050758.5794] audit: op="networking-control" arg="global-dns-configuration" pid=58126 uid=0 result="success"
Nov 25 06:05:58 compute-0 NetworkManager[55345]: <info>  [1764050758.5805] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf)
Nov 25 06:05:58 compute-0 NetworkManager[55345]: <info>  [1764050758.5809] audit: op="networking-control" arg="global-dns-configuration" pid=58126 uid=0 result="success"
Nov 25 06:05:58 compute-0 NetworkManager[55345]: <info>  [1764050758.5832] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=58126 uid=0 result="success"
Nov 25 06:05:58 compute-0 sudo[58463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrobibojnjqhjgdymuzkhvjnvswrzhhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050758.2759619-295-197192769324052/AnsiballZ_async_status.py'
Nov 25 06:05:58 compute-0 sudo[58463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:58 compute-0 NetworkManager[55345]: <info>  [1764050758.6942] checkpoint[0x56045d030af0]: destroy /org/freedesktop/NetworkManager/Checkpoint/3
Nov 25 06:05:58 compute-0 NetworkManager[55345]: <info>  [1764050758.6947] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=58126 uid=0 result="success"
Nov 25 06:05:58 compute-0 ansible-async_wrapper.py[58124]: Module complete (58124)
Nov 25 06:05:58 compute-0 python3.9[58465]: ansible-ansible.legacy.async_status Invoked with jid=j953753991117.58120 mode=status _async_dir=/root/.ansible_async
Nov 25 06:05:58 compute-0 sudo[58463]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:58 compute-0 sudo[58562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oopapnnwsnymvcngdemlhrizvmmexfhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050758.2759619-295-197192769324052/AnsiballZ_async_status.py'
Nov 25 06:05:58 compute-0 sudo[58562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:59 compute-0 python3.9[58564]: ansible-ansible.legacy.async_status Invoked with jid=j953753991117.58120 mode=cleanup _async_dir=/root/.ansible_async
Nov 25 06:05:59 compute-0 sudo[58562]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:59 compute-0 sudo[58715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgjemdvuxkqrkfsshktioickvsknwcot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050759.3148336-317-189794215007154/AnsiballZ_stat.py'
Nov 25 06:05:59 compute-0 sudo[58715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:05:59 compute-0 python3.9[58717]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:05:59 compute-0 sudo[58715]: pam_unix(sudo:session): session closed for user root
Nov 25 06:05:59 compute-0 sudo[58838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bneydbcmogyifggjsfsqomsgoqcsnejk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050759.3148336-317-189794215007154/AnsiballZ_copy.py'
Nov 25 06:05:59 compute-0 sudo[58838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:00 compute-0 python3.9[58840]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764050759.3148336-317-189794215007154/.source.returncode _original_basename=.9q0_rbzl follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:00 compute-0 sudo[58838]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:00 compute-0 ansible-async_wrapper.py[58123]: Done in kid B.
Nov 25 06:06:00 compute-0 sudo[58990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qywowfcxnublszkwrafbxjscfiwnmvkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050760.2092192-333-168377012997803/AnsiballZ_stat.py'
Nov 25 06:06:00 compute-0 sudo[58990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:00 compute-0 python3.9[58992]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:00 compute-0 sudo[58990]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:00 compute-0 sudo[59113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvzgfvywhkhxwvhwxyrdzctbhjccnicd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050760.2092192-333-168377012997803/AnsiballZ_copy.py'
Nov 25 06:06:00 compute-0 sudo[59113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:00 compute-0 python3.9[59115]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764050760.2092192-333-168377012997803/.source.cfg _original_basename=.5r5_rp5e follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:00 compute-0 sudo[59113]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:01 compute-0 sudo[59265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmcmzvqclstgjgjfahujonrwhjhjogfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050761.0656102-348-128761088561430/AnsiballZ_systemd.py'
Nov 25 06:06:01 compute-0 sudo[59265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:01 compute-0 python3.9[59267]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:06:01 compute-0 systemd[1]: Reloading Network Manager...
Nov 25 06:06:01 compute-0 NetworkManager[55345]: <info>  [1764050761.5506] audit: op="reload" arg="0" pid=59271 uid=0 result="success"
Nov 25 06:06:01 compute-0 NetworkManager[55345]: <info>  [1764050761.5511] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 25 06:06:01 compute-0 NetworkManager[55345]: <info>  [1764050761.5512] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 25 06:06:01 compute-0 systemd[1]: Reloaded Network Manager.
Nov 25 06:06:01 compute-0 sudo[59265]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:01 compute-0 sshd-session[51354]: Connection closed by 192.168.122.30 port 45108
Nov 25 06:06:01 compute-0 sshd-session[51351]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:06:01 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Nov 25 06:06:01 compute-0 systemd[1]: session-11.scope: Consumed 33.668s CPU time.
Nov 25 06:06:01 compute-0 systemd-logind[744]: Session 11 logged out. Waiting for processes to exit.
Nov 25 06:06:01 compute-0 systemd-logind[744]: Removed session 11.
Nov 25 06:06:06 compute-0 sshd-session[59302]: Accepted publickey for zuul from 192.168.122.30 port 41452 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:06:06 compute-0 systemd-logind[744]: New session 12 of user zuul.
Nov 25 06:06:06 compute-0 systemd[1]: Started Session 12 of User zuul.
Nov 25 06:06:06 compute-0 sshd-session[59302]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:06:07 compute-0 python3.9[59455]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:06:08 compute-0 python3.9[59609]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:06:08 compute-0 python3.9[59799]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:06:09 compute-0 sshd-session[59305]: Connection closed by 192.168.122.30 port 41452
Nov 25 06:06:09 compute-0 sshd-session[59302]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:06:09 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Nov 25 06:06:09 compute-0 systemd[1]: session-12.scope: Consumed 1.613s CPU time.
Nov 25 06:06:09 compute-0 systemd-logind[744]: Session 12 logged out. Waiting for processes to exit.
Nov 25 06:06:09 compute-0 systemd-logind[744]: Removed session 12.
Nov 25 06:06:10 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 06:06:11 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 06:06:14 compute-0 sshd-session[59828]: Accepted publickey for zuul from 192.168.122.30 port 38894 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:06:14 compute-0 systemd-logind[744]: New session 13 of user zuul.
Nov 25 06:06:14 compute-0 systemd[1]: Started Session 13 of User zuul.
Nov 25 06:06:14 compute-0 sshd-session[59828]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:06:15 compute-0 python3.9[59982]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:06:16 compute-0 python3.9[60136]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:06:16 compute-0 sudo[60290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqigwsedlqesxcqihcwjcgzzkavapmud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050776.485899-40-198369227082901/AnsiballZ_setup.py'
Nov 25 06:06:16 compute-0 sudo[60290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:16 compute-0 python3.9[60292]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:06:17 compute-0 sudo[60290]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:17 compute-0 sudo[60374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvikgpltixfgxftowfqarnkaejftagtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050776.485899-40-198369227082901/AnsiballZ_dnf.py'
Nov 25 06:06:17 compute-0 sudo[60374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:17 compute-0 python3.9[60376]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:06:18 compute-0 sudo[60374]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:18 compute-0 sudo[60528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnfxxzsinphqpftdecvjgmcjdvgcosqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050778.683786-52-145090902968081/AnsiballZ_setup.py'
Nov 25 06:06:18 compute-0 sudo[60528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:19 compute-0 python3.9[60530]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:06:19 compute-0 sudo[60528]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:19 compute-0 sudo[60719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzvniyjpyhdctzjqncsudkpsbhtxsgmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050779.4386468-63-259418319676867/AnsiballZ_file.py'
Nov 25 06:06:19 compute-0 sudo[60719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:19 compute-0 python3.9[60721]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:19 compute-0 sudo[60719]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:20 compute-0 sudo[60871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-getnootcucuxrsznvvxvkxusobktzzjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050780.015284-71-207636527856331/AnsiballZ_command.py'
Nov 25 06:06:20 compute-0 sudo[60871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:20 compute-0 python3.9[60873]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:06:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:06:20 compute-0 sudo[60871]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:20 compute-0 sudo[61032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofvtdpsccyovxpjbqmvcvpkbvqxphtij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050780.6138198-79-43398165459981/AnsiballZ_stat.py'
Nov 25 06:06:20 compute-0 sudo[61032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:21 compute-0 python3.9[61034]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:21 compute-0 sudo[61032]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:21 compute-0 sudo[61110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fozevdqgrwkswjbicklkmpdxdzwpteoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050780.6138198-79-43398165459981/AnsiballZ_file.py'
Nov 25 06:06:21 compute-0 sudo[61110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:21 compute-0 python3.9[61112]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:21 compute-0 sudo[61110]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:21 compute-0 sudo[61263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shdjhjulrdlmediuvkdfkvyvwglmkbko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050781.5078351-91-163898137693040/AnsiballZ_stat.py'
Nov 25 06:06:21 compute-0 sudo[61263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:21 compute-0 python3.9[61265]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:21 compute-0 sudo[61263]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:22 compute-0 sudo[61341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkmkfssymyemyyytrgzvunxjfgwcyzpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050781.5078351-91-163898137693040/AnsiballZ_file.py'
Nov 25 06:06:22 compute-0 sudo[61341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:22 compute-0 python3.9[61343]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:06:22 compute-0 sudo[61341]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:22 compute-0 sudo[61493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbxohuzffvzmeymduxjahggboqhkbpzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050782.3234334-104-254257824881834/AnsiballZ_ini_file.py'
Nov 25 06:06:22 compute-0 sudo[61493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:22 compute-0 python3.9[61495]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:06:22 compute-0 sudo[61493]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:23 compute-0 sudo[61645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbpciuzigpgoluolqdzuvsehlaeixfrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050782.8711665-104-120012827332564/AnsiballZ_ini_file.py'
Nov 25 06:06:23 compute-0 sudo[61645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:23 compute-0 python3.9[61647]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:06:23 compute-0 sudo[61645]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:23 compute-0 sudo[61797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuuwzehzwkrinvcrkgrgsaifrwsqqlld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050783.2973351-104-181568158128850/AnsiballZ_ini_file.py'
Nov 25 06:06:23 compute-0 sudo[61797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:23 compute-0 python3.9[61799]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:06:23 compute-0 sudo[61797]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:23 compute-0 sudo[61949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvwccewszjsqtfcablenewbekbdlfgvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050783.7166264-104-252966930597512/AnsiballZ_ini_file.py'
Nov 25 06:06:23 compute-0 sudo[61949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:24 compute-0 python3.9[61951]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:06:24 compute-0 sudo[61949]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:24 compute-0 sudo[62101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owfjfgreaiiwtxlcviehswvqibgahfuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050784.2193127-135-126292067879201/AnsiballZ_dnf.py'
Nov 25 06:06:24 compute-0 sudo[62101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:24 compute-0 python3.9[62103]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:06:25 compute-0 sudo[62101]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:26 compute-0 sudo[62255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrdeuoxbgdwjgebkhsmdblljqzkhdvly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050785.8517904-146-244545848284621/AnsiballZ_setup.py'
Nov 25 06:06:26 compute-0 sudo[62255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:26 compute-0 python3.9[62257]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:06:26 compute-0 sudo[62255]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:26 compute-0 sudo[62409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyzorceezuutmtaaosewcrjvutcxjoyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050786.3983395-154-76375773063649/AnsiballZ_stat.py'
Nov 25 06:06:26 compute-0 sudo[62409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:26 compute-0 python3.9[62411]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:06:26 compute-0 sudo[62409]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:27 compute-0 sudo[62561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjzwarwwgtjzooqhrvsgwbrmgdawvzfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050786.8768396-163-276808358288554/AnsiballZ_stat.py'
Nov 25 06:06:27 compute-0 sudo[62561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:27 compute-0 python3.9[62563]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:06:27 compute-0 sudo[62561]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:27 compute-0 sudo[62713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvrmfuviolcckdjiixumhitrtwzbpqjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050787.4096756-173-106184834320586/AnsiballZ_command.py'
Nov 25 06:06:27 compute-0 sudo[62713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:27 compute-0 python3.9[62715]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:06:27 compute-0 sudo[62713]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:28 compute-0 sudo[62867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zthkpsqsoejjaxnypmaamnhqcpegusuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050787.943932-183-129211273064186/AnsiballZ_service_facts.py'
Nov 25 06:06:28 compute-0 sudo[62867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:28 compute-0 python3.9[62869]: ansible-service_facts Invoked
Nov 25 06:06:28 compute-0 network[62886]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 06:06:28 compute-0 network[62887]: 'network-scripts' will be removed from distribution in near future.
Nov 25 06:06:28 compute-0 network[62888]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 06:06:30 compute-0 sudo[62867]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:30 compute-0 sudo[63171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwydbvziitmptawqsumnvkvjvzceptcs ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764050790.6295924-198-114565567856170/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764050790.6295924-198-114565567856170/args'
Nov 25 06:06:30 compute-0 sudo[63171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:30 compute-0 sudo[63171]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:31 compute-0 sudo[63338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xougqbuglrjenhpitpntvucqspwiriio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050791.0737674-209-75473291471013/AnsiballZ_dnf.py'
Nov 25 06:06:31 compute-0 sudo[63338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:31 compute-0 python3.9[63340]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:06:32 compute-0 sudo[63338]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:33 compute-0 sudo[63491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdyesflsanfacbfyloksaevaqljjojpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050792.6271052-222-189942693559435/AnsiballZ_package_facts.py'
Nov 25 06:06:33 compute-0 sudo[63491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:33 compute-0 python3.9[63493]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 25 06:06:33 compute-0 sudo[63491]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:34 compute-0 sudo[63643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikruapslpdtpghkbzkjwbqqgjnjdijll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050793.824641-232-95142572340801/AnsiballZ_stat.py'
Nov 25 06:06:34 compute-0 sudo[63643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:34 compute-0 python3.9[63645]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:34 compute-0 sudo[63643]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:34 compute-0 sudo[63768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlitekeeeiipzwpgaxrazanqrxpztbbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050793.824641-232-95142572340801/AnsiballZ_copy.py'
Nov 25 06:06:34 compute-0 sudo[63768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:34 compute-0 python3.9[63770]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764050793.824641-232-95142572340801/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:34 compute-0 sudo[63768]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:35 compute-0 sudo[63922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlxxcbsahccvxaafdgacwtsrzpbmkaze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050794.9740627-247-278011795742367/AnsiballZ_stat.py'
Nov 25 06:06:35 compute-0 sudo[63922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:35 compute-0 python3.9[63924]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:35 compute-0 sudo[63922]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:35 compute-0 sudo[64047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucxjtcsfiqzzshikuvpnytgfaazsgptq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050794.9740627-247-278011795742367/AnsiballZ_copy.py'
Nov 25 06:06:35 compute-0 sudo[64047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:35 compute-0 python3.9[64049]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764050794.9740627-247-278011795742367/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:35 compute-0 sudo[64047]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:36 compute-0 sudo[64201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxiuhkgnklqecaheagvjvtdzhmljajvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050796.0842767-268-174394833686482/AnsiballZ_lineinfile.py'
Nov 25 06:06:36 compute-0 sudo[64201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:36 compute-0 python3.9[64203]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:36 compute-0 sudo[64201]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:37 compute-0 sudo[64355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjszaaqovjixlxggywzuapqtlystckay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050796.9954724-283-102339162501763/AnsiballZ_setup.py'
Nov 25 06:06:37 compute-0 sudo[64355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:37 compute-0 python3.9[64357]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:06:37 compute-0 sudo[64355]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:38 compute-0 sudo[64439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqffunktuegnrihmlumxibnpcozrnwrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050796.9954724-283-102339162501763/AnsiballZ_systemd.py'
Nov 25 06:06:38 compute-0 sudo[64439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:38 compute-0 python3.9[64441]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:06:38 compute-0 sudo[64439]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:38 compute-0 sudo[64593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tssciuhkopveepdektoxwqmiargzsezu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050798.6639562-299-42536607210782/AnsiballZ_setup.py'
Nov 25 06:06:38 compute-0 sudo[64593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:39 compute-0 python3.9[64595]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:06:39 compute-0 sudo[64593]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:39 compute-0 sudo[64677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oavwxurbwhrirlthkapvwgfmowpfnafl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050798.6639562-299-42536607210782/AnsiballZ_systemd.py'
Nov 25 06:06:39 compute-0 sudo[64677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:39 compute-0 python3.9[64679]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:06:39 compute-0 chronyd[753]: chronyd exiting
Nov 25 06:06:39 compute-0 systemd[1]: Stopping NTP client/server...
Nov 25 06:06:39 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Nov 25 06:06:39 compute-0 systemd[1]: Stopped NTP client/server.
Nov 25 06:06:39 compute-0 systemd[1]: Starting NTP client/server...
Nov 25 06:06:39 compute-0 chronyd[64687]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 25 06:06:39 compute-0 chronyd[64687]: Frequency -10.128 +/- 1.091 ppm read from /var/lib/chrony/drift
Nov 25 06:06:39 compute-0 chronyd[64687]: Loaded seccomp filter (level 2)
Nov 25 06:06:39 compute-0 systemd[1]: Started NTP client/server.
Nov 25 06:06:39 compute-0 sudo[64677]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:40 compute-0 sshd-session[59831]: Connection closed by 192.168.122.30 port 38894
Nov 25 06:06:40 compute-0 sshd-session[59828]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:06:40 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Nov 25 06:06:40 compute-0 systemd[1]: session-13.scope: Consumed 17.492s CPU time.
Nov 25 06:06:40 compute-0 systemd-logind[744]: Session 13 logged out. Waiting for processes to exit.
Nov 25 06:06:40 compute-0 systemd-logind[744]: Removed session 13.
Nov 25 06:06:44 compute-0 sshd-session[64713]: Accepted publickey for zuul from 192.168.122.30 port 54946 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:06:44 compute-0 systemd-logind[744]: New session 14 of user zuul.
Nov 25 06:06:44 compute-0 systemd[1]: Started Session 14 of User zuul.
Nov 25 06:06:44 compute-0 sshd-session[64713]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:06:45 compute-0 python3.9[64866]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:06:46 compute-0 sudo[65020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odlrfwgnqlyblazoetfezwfotmlhhtuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050806.1456015-33-264338841971161/AnsiballZ_file.py'
Nov 25 06:06:46 compute-0 sudo[65020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:46 compute-0 python3.9[65022]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:46 compute-0 sudo[65020]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:47 compute-0 sudo[65195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqgseeviyeyzcdlvymmkokrsufamnhkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050806.7141376-41-151927097077769/AnsiballZ_stat.py'
Nov 25 06:06:47 compute-0 sudo[65195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:47 compute-0 python3.9[65197]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:47 compute-0 sudo[65195]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:47 compute-0 sudo[65273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbmdcppbgjsdlyddhjeysxfzgobqblez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050806.7141376-41-151927097077769/AnsiballZ_file.py'
Nov 25 06:06:47 compute-0 sudo[65273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:47 compute-0 python3.9[65275]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.ah4epo6l recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:47 compute-0 sudo[65273]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:47 compute-0 sudo[65425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfwiyafwvhkwfgddlnzdhgkujoaeiilo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050807.805556-61-95030976788463/AnsiballZ_stat.py'
Nov 25 06:06:47 compute-0 sudo[65425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:48 compute-0 python3.9[65427]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:48 compute-0 sudo[65425]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:48 compute-0 sudo[65548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voptkprlornlhqkeynxitoracnqjctxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050807.805556-61-95030976788463/AnsiballZ_copy.py'
Nov 25 06:06:48 compute-0 sudo[65548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:48 compute-0 python3.9[65550]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764050807.805556-61-95030976788463/.source _original_basename=.zng1me9y follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:48 compute-0 sudo[65548]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:48 compute-0 sudo[65700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trikzgkeltxiipycfclqdmspodfeqthk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050808.8069456-77-260255502705204/AnsiballZ_file.py'
Nov 25 06:06:48 compute-0 sudo[65700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:49 compute-0 python3.9[65702]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:06:49 compute-0 sudo[65700]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:49 compute-0 sudo[65852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufzbqgoaptvbhidfflltiqdvtaprksxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050809.2731643-85-131700653648128/AnsiballZ_stat.py'
Nov 25 06:06:49 compute-0 sudo[65852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:49 compute-0 python3.9[65854]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:49 compute-0 sudo[65852]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:49 compute-0 sudo[65975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfqxadyuvndefxwyzpozsiatyeubpjgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050809.2731643-85-131700653648128/AnsiballZ_copy.py'
Nov 25 06:06:49 compute-0 sudo[65975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:50 compute-0 python3.9[65977]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050809.2731643-85-131700653648128/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:06:50 compute-0 sudo[65975]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:50 compute-0 sudo[66127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlbsvflkwybpfwulztwcxjykazdyxxzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050810.2073147-85-26323377972244/AnsiballZ_stat.py'
Nov 25 06:06:50 compute-0 sudo[66127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:50 compute-0 python3.9[66129]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:50 compute-0 sudo[66127]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:50 compute-0 sudo[66250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjlhflbpkeekofoxlnqmwuaemyiuayih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050810.2073147-85-26323377972244/AnsiballZ_copy.py'
Nov 25 06:06:50 compute-0 sudo[66250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:50 compute-0 python3.9[66252]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050810.2073147-85-26323377972244/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:06:50 compute-0 sudo[66250]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:51 compute-0 sudo[66402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqsgbsgkpvddusdvalxdtclgdoswycww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050811.0269694-114-120945922969969/AnsiballZ_file.py'
Nov 25 06:06:51 compute-0 sudo[66402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:51 compute-0 python3.9[66404]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:51 compute-0 sudo[66402]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:51 compute-0 sudo[66554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydplqeqslcgmxxbsujqgbfnxogmsbpth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050811.4728036-122-169088923960876/AnsiballZ_stat.py'
Nov 25 06:06:51 compute-0 sudo[66554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:51 compute-0 python3.9[66556]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:51 compute-0 sudo[66554]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:52 compute-0 sudo[66677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpnfxfavquzyexbsdaphbzfzclleqhjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050811.4728036-122-169088923960876/AnsiballZ_copy.py'
Nov 25 06:06:52 compute-0 sudo[66677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:52 compute-0 python3.9[66679]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050811.4728036-122-169088923960876/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:52 compute-0 sudo[66677]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:52 compute-0 sudo[66829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltkhdqznuthkeqnmhuojbwprwnsxfstb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050812.3013487-137-125835031827954/AnsiballZ_stat.py'
Nov 25 06:06:52 compute-0 sudo[66829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:52 compute-0 python3.9[66831]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:52 compute-0 sudo[66829]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:52 compute-0 sudo[66952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfoxnqippgdubgijwwdhyfiucwthqure ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050812.3013487-137-125835031827954/AnsiballZ_copy.py'
Nov 25 06:06:52 compute-0 sudo[66952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:53 compute-0 python3.9[66954]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050812.3013487-137-125835031827954/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:53 compute-0 sudo[66952]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:53 compute-0 sudo[67104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqbvtxpbmlzgknwvcmdbxqkrymlcvllr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050813.1391566-152-208154281487409/AnsiballZ_systemd.py'
Nov 25 06:06:53 compute-0 sudo[67104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:53 compute-0 python3.9[67106]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:06:53 compute-0 systemd[1]: Reloading.
Nov 25 06:06:53 compute-0 systemd-rc-local-generator[67129]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:06:53 compute-0 systemd-sysv-generator[67135]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:06:53 compute-0 systemd[1]: Reloading.
Nov 25 06:06:54 compute-0 systemd-sysv-generator[67168]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:06:54 compute-0 systemd-rc-local-generator[67164]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:06:54 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Nov 25 06:06:54 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Nov 25 06:06:54 compute-0 sudo[67104]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:54 compute-0 sudo[67332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxadtnfqzcepwbudshsncktgvmichcbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050814.2951772-160-133117331613258/AnsiballZ_stat.py'
Nov 25 06:06:54 compute-0 sudo[67332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:54 compute-0 python3.9[67334]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:54 compute-0 sudo[67332]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:54 compute-0 sudo[67455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sercoxchpkovbaoeskennxixmofbpikv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050814.2951772-160-133117331613258/AnsiballZ_copy.py'
Nov 25 06:06:54 compute-0 sudo[67455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:55 compute-0 python3.9[67457]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050814.2951772-160-133117331613258/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:55 compute-0 sudo[67455]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:55 compute-0 sudo[67607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfshpukdmpeewpwclbysxzidmsfuaapg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050815.1378503-175-191971300544255/AnsiballZ_stat.py'
Nov 25 06:06:55 compute-0 sudo[67607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:55 compute-0 python3.9[67609]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:06:55 compute-0 sudo[67607]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:55 compute-0 sudo[67730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssaoomkxnehsibsqzcdqpvlrslfontvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050815.1378503-175-191971300544255/AnsiballZ_copy.py'
Nov 25 06:06:55 compute-0 sudo[67730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:55 compute-0 python3.9[67732]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050815.1378503-175-191971300544255/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:06:55 compute-0 sudo[67730]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:56 compute-0 sudo[67882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrkdrzllhpklpnnpazyygolgjvjjaxab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050815.9581585-190-225622256087436/AnsiballZ_systemd.py'
Nov 25 06:06:56 compute-0 sudo[67882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:56 compute-0 python3.9[67884]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:06:56 compute-0 systemd[1]: Reloading.
Nov 25 06:06:56 compute-0 systemd-sysv-generator[67908]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:06:56 compute-0 systemd-rc-local-generator[67905]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:06:56 compute-0 systemd[1]: Reloading.
Nov 25 06:06:56 compute-0 systemd-sysv-generator[67948]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:06:56 compute-0 systemd-rc-local-generator[67944]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:06:56 compute-0 systemd[1]: Starting Create netns directory...
Nov 25 06:06:56 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 25 06:06:56 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 25 06:06:56 compute-0 systemd[1]: Finished Create netns directory.
Nov 25 06:06:56 compute-0 sudo[67882]: pam_unix(sudo:session): session closed for user root
Nov 25 06:06:57 compute-0 python3.9[68110]: ansible-ansible.builtin.service_facts Invoked
Nov 25 06:06:57 compute-0 network[68127]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 06:06:57 compute-0 network[68128]: 'network-scripts' will be removed from distribution in near future.
Nov 25 06:06:57 compute-0 network[68129]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 06:06:59 compute-0 sudo[68389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nggaimsoprszruwuiarfzudyokkpixxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050819.2413683-206-119712692013703/AnsiballZ_systemd.py'
Nov 25 06:06:59 compute-0 sudo[68389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:06:59 compute-0 python3.9[68391]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:06:59 compute-0 systemd[1]: Reloading.
Nov 25 06:06:59 compute-0 systemd-rc-local-generator[68415]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:06:59 compute-0 systemd-sysv-generator[68418]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:06:59 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 25 06:07:00 compute-0 iptables.init[68432]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 25 06:07:00 compute-0 iptables.init[68432]: iptables: Flushing firewall rules: [  OK  ]
Nov 25 06:07:00 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Nov 25 06:07:00 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 25 06:07:00 compute-0 sudo[68389]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:00 compute-0 sudo[68626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpxlzachsqdesezbsyjrfwykjzupnjgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050820.2951-206-145747482677818/AnsiballZ_systemd.py'
Nov 25 06:07:00 compute-0 sudo[68626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:00 compute-0 python3.9[68628]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:07:00 compute-0 sudo[68626]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:01 compute-0 sudo[68780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdjoniwzdasszuatjwiiaosvcvesyvuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050820.943679-222-68815496288020/AnsiballZ_systemd.py'
Nov 25 06:07:01 compute-0 sudo[68780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:01 compute-0 python3.9[68782]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:07:01 compute-0 systemd[1]: Reloading.
Nov 25 06:07:01 compute-0 systemd-sysv-generator[68815]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:07:01 compute-0 systemd-rc-local-generator[68805]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:07:01 compute-0 systemd[1]: Starting Netfilter Tables...
Nov 25 06:07:01 compute-0 systemd[1]: Finished Netfilter Tables.
Nov 25 06:07:01 compute-0 sudo[68780]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:02 compute-0 sudo[68973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyfopdhgvurqrurromllltnsdbcyxbtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050821.729861-230-70813710603269/AnsiballZ_command.py'
Nov 25 06:07:02 compute-0 sudo[68973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:02 compute-0 python3.9[68975]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:07:02 compute-0 sudo[68973]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:02 compute-0 sudo[69126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgkuzqdwbfsgfpnqthtidoftoyyjgcbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050822.5058746-244-27997213686660/AnsiballZ_stat.py'
Nov 25 06:07:02 compute-0 sudo[69126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:02 compute-0 python3.9[69128]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:02 compute-0 sudo[69126]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:03 compute-0 sudo[69251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzbqvhdredrkferhfxqtcubsiwvfzjcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050822.5058746-244-27997213686660/AnsiballZ_copy.py'
Nov 25 06:07:03 compute-0 sudo[69251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:03 compute-0 python3.9[69253]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764050822.5058746-244-27997213686660/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:03 compute-0 sudo[69251]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:03 compute-0 sudo[69404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaqhkgpoxysbrapjzoefyxqbhsmpliwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050823.428973-259-168264715173659/AnsiballZ_systemd.py'
Nov 25 06:07:03 compute-0 sudo[69404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:03 compute-0 python3.9[69406]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:07:03 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Nov 25 06:07:03 compute-0 sshd[962]: Received SIGHUP; restarting.
Nov 25 06:07:03 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Nov 25 06:07:03 compute-0 sshd[962]: Server listening on 0.0.0.0 port 22.
Nov 25 06:07:03 compute-0 sshd[962]: Server listening on :: port 22.
Nov 25 06:07:03 compute-0 sudo[69404]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:04 compute-0 sudo[69560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrssagltmtdureehuviqsamuhossqvya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050824.0440497-267-132722898036793/AnsiballZ_file.py'
Nov 25 06:07:04 compute-0 sudo[69560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:04 compute-0 python3.9[69562]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:04 compute-0 sudo[69560]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:04 compute-0 sudo[69712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ferngrqcwdajltabiizjptqcsjmnelhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050824.5212367-275-42764382666026/AnsiballZ_stat.py'
Nov 25 06:07:04 compute-0 sudo[69712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:04 compute-0 python3.9[69714]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:04 compute-0 sudo[69712]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:05 compute-0 sudo[69835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmouglniymwqvjbmbfumkzqmsofnloam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050824.5212367-275-42764382666026/AnsiballZ_copy.py'
Nov 25 06:07:05 compute-0 sudo[69835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:05 compute-0 python3.9[69837]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050824.5212367-275-42764382666026/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:05 compute-0 sudo[69835]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:05 compute-0 sudo[69987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqotnfofneappgnarjnqhhznigxsmaoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050825.4609296-293-22796563598356/AnsiballZ_timezone.py'
Nov 25 06:07:05 compute-0 sudo[69987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:05 compute-0 python3.9[69989]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 25 06:07:05 compute-0 systemd[1]: Starting Time & Date Service...
Nov 25 06:07:05 compute-0 systemd[1]: Started Time & Date Service.
Nov 25 06:07:06 compute-0 sudo[69987]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:06 compute-0 sudo[70143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhsrnmglenygyfeahpsjrjqjcfhsdydn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050826.1717453-302-117397324404593/AnsiballZ_file.py'
Nov 25 06:07:06 compute-0 sudo[70143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:06 compute-0 python3.9[70145]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:06 compute-0 sudo[70143]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:06 compute-0 sudo[70295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiemighbmopnrfltxvvdwtrodhalvhkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050826.6362212-310-59249404314539/AnsiballZ_stat.py'
Nov 25 06:07:06 compute-0 sudo[70295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:06 compute-0 python3.9[70297]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:06 compute-0 sudo[70295]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:07 compute-0 sudo[70418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnzxgxmgvdnmhrzhibuulgqmigeorsot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050826.6362212-310-59249404314539/AnsiballZ_copy.py'
Nov 25 06:07:07 compute-0 sudo[70418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:07 compute-0 python3.9[70420]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764050826.6362212-310-59249404314539/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:07 compute-0 sudo[70418]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:07 compute-0 sudo[70570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktjjhupojkvvyhbdqjshojlrcnikywsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050827.481014-325-59328795206822/AnsiballZ_stat.py'
Nov 25 06:07:07 compute-0 sudo[70570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:07 compute-0 python3.9[70572]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:07 compute-0 sudo[70570]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:08 compute-0 sudo[70693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpgrtxdsfpeobilmspzmarysmyrsfbdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050827.481014-325-59328795206822/AnsiballZ_copy.py'
Nov 25 06:07:08 compute-0 sudo[70693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:08 compute-0 python3.9[70695]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764050827.481014-325-59328795206822/.source.yaml _original_basename=.8z5_p6ml follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:08 compute-0 sudo[70693]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:08 compute-0 sudo[70845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpcylezmmzgmphxqycyeqxztvqdtbbfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050828.4345994-340-69190067117115/AnsiballZ_stat.py'
Nov 25 06:07:08 compute-0 sudo[70845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:08 compute-0 python3.9[70847]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:08 compute-0 sudo[70845]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:08 compute-0 sudo[70968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwiwlocmpqatahpahwsflabhvfjrygrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050828.4345994-340-69190067117115/AnsiballZ_copy.py'
Nov 25 06:07:08 compute-0 sudo[70968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:09 compute-0 python3.9[70970]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050828.4345994-340-69190067117115/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:09 compute-0 sudo[70968]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:09 compute-0 sudo[71120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhfjzmrvkyxerrmcbxmatznfwodtvjqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050829.272457-355-31350671851170/AnsiballZ_command.py'
Nov 25 06:07:09 compute-0 sudo[71120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:09 compute-0 python3.9[71122]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:07:09 compute-0 sudo[71120]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:09 compute-0 sudo[71273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwwlncwuquyltkupfobkaqfdldtdybci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050829.7417796-363-142766035782635/AnsiballZ_command.py'
Nov 25 06:07:09 compute-0 sudo[71273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:10 compute-0 python3.9[71275]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:07:10 compute-0 sudo[71273]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:10 compute-0 sudo[71426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzsyobwmqemswnfappynktirtzavswxj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764050830.190276-371-114409188491441/AnsiballZ_edpm_nftables_from_files.py'
Nov 25 06:07:10 compute-0 sudo[71426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:10 compute-0 python3[71428]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 25 06:07:10 compute-0 sudo[71426]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:10 compute-0 sudo[71578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfxhdlraginavkvmjdjrxyrswlgdohtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050830.7694623-379-54674668889926/AnsiballZ_stat.py'
Nov 25 06:07:10 compute-0 sudo[71578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:11 compute-0 python3.9[71580]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:11 compute-0 sudo[71578]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:11 compute-0 sudo[71701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftcdsyzlrpwaqbspwcafhoshlygtrhpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050830.7694623-379-54674668889926/AnsiballZ_copy.py'
Nov 25 06:07:11 compute-0 sudo[71701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:11 compute-0 python3.9[71703]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050830.7694623-379-54674668889926/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:11 compute-0 sudo[71701]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:11 compute-0 sudo[71853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjeegvgxufpzspbptsgskiqssmrjwtfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050831.6213572-394-163558132833293/AnsiballZ_stat.py'
Nov 25 06:07:11 compute-0 sudo[71853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:11 compute-0 python3.9[71855]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:12 compute-0 sudo[71853]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:12 compute-0 sudo[71976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lopzcelawtwzqvkkommobcxmqvrhlrof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050831.6213572-394-163558132833293/AnsiballZ_copy.py'
Nov 25 06:07:12 compute-0 sudo[71976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:12 compute-0 python3.9[71978]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050831.6213572-394-163558132833293/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:12 compute-0 sudo[71976]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:12 compute-0 sudo[72128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlfuqwrzcqxwlawobrqoyhantylieeeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050832.495047-409-265220949272605/AnsiballZ_stat.py'
Nov 25 06:07:12 compute-0 sudo[72128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:12 compute-0 python3.9[72130]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:12 compute-0 sudo[72128]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:13 compute-0 sudo[72251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogorjpxqshjemmbattyhlipmfuwoziag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050832.495047-409-265220949272605/AnsiballZ_copy.py'
Nov 25 06:07:13 compute-0 sudo[72251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:13 compute-0 python3.9[72253]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050832.495047-409-265220949272605/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:13 compute-0 sudo[72251]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:13 compute-0 sudo[72403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssbzonrtynsceyqabawgaafeyodmbatc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050833.4622135-424-6425996872414/AnsiballZ_stat.py'
Nov 25 06:07:13 compute-0 sudo[72403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:13 compute-0 python3.9[72405]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:13 compute-0 sudo[72403]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:14 compute-0 sudo[72526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egmxkrzizespxxpvcqlvmbuctucinfzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050833.4622135-424-6425996872414/AnsiballZ_copy.py'
Nov 25 06:07:14 compute-0 sudo[72526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:14 compute-0 python3.9[72528]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050833.4622135-424-6425996872414/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:14 compute-0 sudo[72526]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:14 compute-0 sudo[72678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryblwcmmsmlfkespsedmuamsgixbpkor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050834.31427-439-79716573957501/AnsiballZ_stat.py'
Nov 25 06:07:14 compute-0 sudo[72678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:14 compute-0 python3.9[72680]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:14 compute-0 sudo[72678]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:14 compute-0 sudo[72801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mshjxdkpjtqevzrvkoewynoyfsaneift ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050834.31427-439-79716573957501/AnsiballZ_copy.py'
Nov 25 06:07:14 compute-0 sudo[72801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:15 compute-0 python3.9[72803]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050834.31427-439-79716573957501/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:15 compute-0 sudo[72801]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:15 compute-0 sudo[72953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vddxweqyzhlubootsoxecoiuereevtzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050835.2090166-454-112181327582649/AnsiballZ_file.py'
Nov 25 06:07:15 compute-0 sudo[72953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:15 compute-0 python3.9[72955]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:15 compute-0 sudo[72953]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:15 compute-0 sudo[73105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohjaqnnvdiafcleszpjbaryjysjxngbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050835.6632357-462-248840253436091/AnsiballZ_command.py'
Nov 25 06:07:15 compute-0 sudo[73105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:15 compute-0 python3.9[73107]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:07:16 compute-0 sudo[73105]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:16 compute-0 sudo[73264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tntyznpmfdvtkjthwnllthnlpuetdceq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050836.1464756-470-148261388892743/AnsiballZ_blockinfile.py'
Nov 25 06:07:16 compute-0 sudo[73264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:16 compute-0 python3.9[73266]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:16 compute-0 sudo[73264]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:16 compute-0 sudo[73417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptzbgwebzgigvhacmrdpdwjnoxzlclzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050836.824033-479-101533884728899/AnsiballZ_file.py'
Nov 25 06:07:16 compute-0 sudo[73417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:17 compute-0 python3.9[73419]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:17 compute-0 sudo[73417]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:17 compute-0 sudo[73569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojwolytnggsicahbsczieemdpguidicd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050837.2607648-479-142000100193547/AnsiballZ_file.py'
Nov 25 06:07:17 compute-0 sudo[73569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:17 compute-0 python3.9[73571]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:17 compute-0 sudo[73569]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:18 compute-0 sudo[73721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfnmzshspnrrnuwiqzkiqkltwxidaknz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050837.8416917-494-38319488047288/AnsiballZ_mount.py'
Nov 25 06:07:18 compute-0 sudo[73721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:18 compute-0 python3.9[73723]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 25 06:07:18 compute-0 sudo[73721]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:18 compute-0 sudo[73874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lszppkmqlvsqtywjealnaucjeeznyxmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050838.4612985-494-240025549821904/AnsiballZ_mount.py'
Nov 25 06:07:18 compute-0 sudo[73874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:18 compute-0 python3.9[73876]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 25 06:07:18 compute-0 sudo[73874]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:19 compute-0 sshd-session[64716]: Connection closed by 192.168.122.30 port 54946
Nov 25 06:07:19 compute-0 sshd-session[64713]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:07:19 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Nov 25 06:07:19 compute-0 systemd[1]: session-14.scope: Consumed 23.820s CPU time.
Nov 25 06:07:19 compute-0 systemd-logind[744]: Session 14 logged out. Waiting for processes to exit.
Nov 25 06:07:19 compute-0 systemd-logind[744]: Removed session 14.
Nov 25 06:07:24 compute-0 sshd-session[73902]: Accepted publickey for zuul from 192.168.122.30 port 46838 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:07:24 compute-0 systemd-logind[744]: New session 15 of user zuul.
Nov 25 06:07:24 compute-0 systemd[1]: Started Session 15 of User zuul.
Nov 25 06:07:24 compute-0 sshd-session[73902]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:07:24 compute-0 sudo[74055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfmgflcxoxwktfkotgickwlouapztvyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050844.4515092-16-21833628446666/AnsiballZ_tempfile.py'
Nov 25 06:07:24 compute-0 sudo[74055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:24 compute-0 python3.9[74057]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 25 06:07:24 compute-0 sudo[74055]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:24 compute-0 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 06:07:25 compute-0 sudo[74208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilbhlsusigrafslwdmghvpeuucjlykpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050845.0425525-28-44657372287179/AnsiballZ_stat.py'
Nov 25 06:07:25 compute-0 sudo[74208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:25 compute-0 python3.9[74210]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:07:25 compute-0 sudo[74208]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:26 compute-0 sudo[74360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beejxcxsmrpamgssqoopjbkqmiwzhiea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050845.6288972-38-201172072989710/AnsiballZ_setup.py'
Nov 25 06:07:26 compute-0 sudo[74360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:26 compute-0 python3.9[74362]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:07:26 compute-0 sudo[74360]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:26 compute-0 sudo[74512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nomciwxcdqhourilvlldbvrduggkodxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050846.484711-47-257696433425459/AnsiballZ_blockinfile.py'
Nov 25 06:07:26 compute-0 sudo[74512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:26 compute-0 python3.9[74514]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCQUJm/OoZehidASD6+fzFJq9VKK0orOq9+vqE+nl5PLNBalTM2ffTaw5Njn+XoQPmvIqNGDNbcL9bMv1IZ8/67ldtTR1gYzP8WmZhR2yf+J8LbKO0KDX3tmtfQu3TqDr+rr1B8JawfignLZyP/IlMTIUJSZTOgGb9w8mw2aCgaSOxEhPZACNE4kGinyO0rhw/8FN9pOhhazwGt9r0KAp+A8eKRKrMkCNJzKhd9AChhqXSCpZEQH1MwdycR7huMAxjwSkt4B/PV4XDyURMGBlUuNyLMjfW+alz7N31RHGk00DGzCIiFq0Tf/Gd6cMN26l1iYx9gXCvN3uGbrd77Bb+1GEh7O+ynQrtMejsg92QxhmXfCxfK2VOucbpX3gUl0DdzchwbYl4AQdOAk3nt7HLYBj4EBovXCEwt5LOPfCFI3WWANo0HjHPW/sWOB77OChQU5WmTCUpxE/+cC/Mp5Xmvj/yTTwnxJFDBg4FMPV/lYa+5HS89dMJPBPJPF+TLAac=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPevRf6KUZlUkB+0Nz5tMACGagyGeaPeYItZOqTz4O9z
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDJ19su2E/b4wM9U5Ns6ALKR7wEQg86wMIbo+krFP9KPp/kp6XtS82izWYCkc8sZsvjKxIr1w6Fr7i+hcCyTX0Y=
                                             create=True mode=0644 path=/tmp/ansible.k2rtz461 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:26 compute-0 sudo[74512]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:27 compute-0 sudo[74664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxmgrjiliffsshuqqwuyczmbsklgxtvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050847.0430262-55-14446339599962/AnsiballZ_command.py'
Nov 25 06:07:27 compute-0 sudo[74664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:27 compute-0 python3.9[74666]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.k2rtz461' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:07:27 compute-0 sudo[74664]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:27 compute-0 sudo[74818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzfcftofyzvonmcggkkrufoguvzypplb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050847.6058044-63-163202264403757/AnsiballZ_file.py'
Nov 25 06:07:27 compute-0 sudo[74818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:28 compute-0 python3.9[74820]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.k2rtz461 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:28 compute-0 sudo[74818]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:28 compute-0 sshd-session[73905]: Connection closed by 192.168.122.30 port 46838
Nov 25 06:07:28 compute-0 sshd-session[73902]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:07:28 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Nov 25 06:07:28 compute-0 systemd[1]: session-15.scope: Consumed 2.306s CPU time.
Nov 25 06:07:28 compute-0 systemd-logind[744]: Session 15 logged out. Waiting for processes to exit.
Nov 25 06:07:28 compute-0 systemd-logind[744]: Removed session 15.
Nov 25 06:07:33 compute-0 sshd-session[74845]: Accepted publickey for zuul from 192.168.122.30 port 40452 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:07:33 compute-0 systemd-logind[744]: New session 16 of user zuul.
Nov 25 06:07:33 compute-0 systemd[1]: Started Session 16 of User zuul.
Nov 25 06:07:33 compute-0 sshd-session[74845]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:07:34 compute-0 python3.9[74998]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:07:35 compute-0 sudo[75152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyjpvscuzlgdikrywcfwfjmjuofvkxka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050854.6072726-32-51028581379188/AnsiballZ_systemd.py'
Nov 25 06:07:35 compute-0 sudo[75152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:35 compute-0 python3.9[75154]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 25 06:07:35 compute-0 sudo[75152]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:35 compute-0 sudo[75306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjcbgndrywjtnyhwtktbaffrblglieci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050855.4345925-40-121076900811676/AnsiballZ_systemd.py'
Nov 25 06:07:35 compute-0 sudo[75306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:35 compute-0 python3.9[75308]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:07:35 compute-0 sudo[75306]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:36 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 25 06:07:36 compute-0 sudo[75461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtufvnbwyhgldsbdliowpubeiaxptwpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050856.027635-49-235571470336054/AnsiballZ_command.py'
Nov 25 06:07:36 compute-0 sudo[75461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:36 compute-0 python3.9[75463]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:07:36 compute-0 sudo[75461]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:36 compute-0 sudo[75614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rllxcccshophmqnaoiczerovvhaojidy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050856.5939155-57-187210900546136/AnsiballZ_stat.py'
Nov 25 06:07:36 compute-0 sudo[75614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:37 compute-0 python3.9[75616]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:07:37 compute-0 sudo[75614]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:37 compute-0 sudo[75768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmfqrpmjltqfskdhccctuitspzouskjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050857.172308-65-10970104837657/AnsiballZ_command.py'
Nov 25 06:07:37 compute-0 sudo[75768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:37 compute-0 python3.9[75770]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:07:37 compute-0 sudo[75768]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:37 compute-0 sudo[75923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybkujaiyselgjgftcnmqgdzioefliqhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050857.6230457-73-146385826388444/AnsiballZ_file.py'
Nov 25 06:07:37 compute-0 sudo[75923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:38 compute-0 python3.9[75925]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:38 compute-0 sudo[75923]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:38 compute-0 sshd-session[74848]: Connection closed by 192.168.122.30 port 40452
Nov 25 06:07:38 compute-0 sshd-session[74845]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:07:38 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Nov 25 06:07:38 compute-0 systemd[1]: session-16.scope: Consumed 3.026s CPU time.
Nov 25 06:07:38 compute-0 systemd-logind[744]: Session 16 logged out. Waiting for processes to exit.
Nov 25 06:07:38 compute-0 systemd-logind[744]: Removed session 16.
Nov 25 06:07:43 compute-0 sshd-session[75950]: Accepted publickey for zuul from 192.168.122.30 port 33082 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:07:43 compute-0 systemd-logind[744]: New session 17 of user zuul.
Nov 25 06:07:43 compute-0 systemd[1]: Started Session 17 of User zuul.
Nov 25 06:07:43 compute-0 sshd-session[75950]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:07:44 compute-0 python3.9[76103]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:07:44 compute-0 sudo[76257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clmpbmubxbzzmkjavzhgcqnsxnkelcvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050864.5526664-34-112224463641309/AnsiballZ_setup.py'
Nov 25 06:07:44 compute-0 sudo[76257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:44 compute-0 python3.9[76259]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:07:45 compute-0 sudo[76257]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:45 compute-0 sudo[76341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqulfiybbmfhxtnyowpbcoubwwrpxfbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050864.5526664-34-112224463641309/AnsiballZ_dnf.py'
Nov 25 06:07:45 compute-0 sudo[76341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:45 compute-0 python3.9[76343]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 06:07:46 compute-0 sudo[76341]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:47 compute-0 python3.9[76494]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:07:48 compute-0 python3.9[76645]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 06:07:48 compute-0 python3.9[76795]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:07:49 compute-0 python3.9[76945]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:07:49 compute-0 sshd-session[75953]: Connection closed by 192.168.122.30 port 33082
Nov 25 06:07:49 compute-0 sshd-session[75950]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:07:49 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Nov 25 06:07:49 compute-0 systemd[1]: session-17.scope: Consumed 4.210s CPU time.
Nov 25 06:07:49 compute-0 systemd-logind[744]: Session 17 logged out. Waiting for processes to exit.
Nov 25 06:07:49 compute-0 systemd-logind[744]: Removed session 17.
Nov 25 06:07:54 compute-0 sshd-session[76970]: Accepted publickey for zuul from 192.168.122.30 port 59406 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:07:54 compute-0 systemd-logind[744]: New session 18 of user zuul.
Nov 25 06:07:54 compute-0 systemd[1]: Started Session 18 of User zuul.
Nov 25 06:07:54 compute-0 sshd-session[76970]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:07:55 compute-0 python3.9[77123]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:07:56 compute-0 sudo[77277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsgjvurbeawbzmvfsimicpswowemrmuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050876.4944034-50-58431166242510/AnsiballZ_file.py'
Nov 25 06:07:56 compute-0 sudo[77277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:56 compute-0 python3.9[77279]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:07:57 compute-0 sudo[77277]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:57 compute-0 sudo[77429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbtmuspmnhzzmjaosbpdrttrgwqkihmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050877.0884616-50-39716263385389/AnsiballZ_file.py'
Nov 25 06:07:57 compute-0 sudo[77429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:57 compute-0 python3.9[77431]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:07:57 compute-0 sudo[77429]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:57 compute-0 sudo[77581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbkprcrkbibhcweazjathroiviwhkjoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050877.5404496-65-118007747694958/AnsiballZ_stat.py'
Nov 25 06:07:57 compute-0 sudo[77581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:57 compute-0 python3.9[77583]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:57 compute-0 sudo[77581]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:58 compute-0 sudo[77704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mckmsoyekmwjxpsjooguhjqanfackwoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050877.5404496-65-118007747694958/AnsiballZ_copy.py'
Nov 25 06:07:58 compute-0 sudo[77704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:58 compute-0 python3.9[77706]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050877.5404496-65-118007747694958/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=16c56d387948a2eb8e13582eb1218b68286dd427 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:58 compute-0 sudo[77704]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:58 compute-0 sudo[77856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgldhpgtfdpuqiucgwxsfqwasgvljocy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050878.5606947-65-167247216708073/AnsiballZ_stat.py'
Nov 25 06:07:58 compute-0 sudo[77856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:58 compute-0 python3.9[77858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:58 compute-0 sudo[77856]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:59 compute-0 sudo[77979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wktsoxxddbhysxmduaspkhunmeofxlho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050878.5606947-65-167247216708073/AnsiballZ_copy.py'
Nov 25 06:07:59 compute-0 sudo[77979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:59 compute-0 python3.9[77981]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050878.5606947-65-167247216708073/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=2d91b9783e938c188656f29ca75ed1ba893d2fa6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:07:59 compute-0 sudo[77979]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:59 compute-0 sudo[78131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlpvajrbwkqiwbkjkadccqgxrbsxujfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050879.3466463-65-236134251281803/AnsiballZ_stat.py'
Nov 25 06:07:59 compute-0 sudo[78131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:07:59 compute-0 python3.9[78133]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:07:59 compute-0 sudo[78131]: pam_unix(sudo:session): session closed for user root
Nov 25 06:07:59 compute-0 sudo[78254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zouljuolnempnjqxoqbobafwuckmupha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050879.3466463-65-236134251281803/AnsiballZ_copy.py'
Nov 25 06:07:59 compute-0 sudo[78254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:00 compute-0 python3.9[78256]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050879.3466463-65-236134251281803/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=d7e411972f5ca9113ffdc4b1383845778118a7ea backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:00 compute-0 sudo[78254]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:00 compute-0 sudo[78406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgbwwyhyhztivysrmggbptwbfhlbdxvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050880.196044-109-171412069287788/AnsiballZ_file.py'
Nov 25 06:08:00 compute-0 sudo[78406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:00 compute-0 python3.9[78408]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:00 compute-0 sudo[78406]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:00 compute-0 sudo[78558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxdkbxvctmurvvbooliwssqputzonmtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050880.6145673-109-255598349352287/AnsiballZ_file.py'
Nov 25 06:08:00 compute-0 sudo[78558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:00 compute-0 python3.9[78560]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:00 compute-0 sudo[78558]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:01 compute-0 sudo[78710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geuhcbwttlejmycxtckajgqijzjmawsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050881.0747104-124-247689385828488/AnsiballZ_stat.py'
Nov 25 06:08:01 compute-0 sudo[78710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:01 compute-0 python3.9[78712]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:01 compute-0 sudo[78710]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:01 compute-0 sudo[78833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvgmgjjbvxngwmumzgsgudkzerdxvmwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050881.0747104-124-247689385828488/AnsiballZ_copy.py'
Nov 25 06:08:01 compute-0 sudo[78833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:01 compute-0 python3.9[78835]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050881.0747104-124-247689385828488/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=5766d948a0a0cb312f168ec1ff9505055845dcd0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:01 compute-0 sudo[78833]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:02 compute-0 sudo[78985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkpopyjnbkebzkamkfvrwjjxlrnbpfrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050881.8972394-124-135527621653627/AnsiballZ_stat.py'
Nov 25 06:08:02 compute-0 sudo[78985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:02 compute-0 python3.9[78987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:02 compute-0 sudo[78985]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:02 compute-0 sudo[79108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnuykrdijubbxzdctaxmwtjijnnauwfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050881.8972394-124-135527621653627/AnsiballZ_copy.py'
Nov 25 06:08:02 compute-0 sudo[79108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:02 compute-0 python3.9[79110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050881.8972394-124-135527621653627/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=7d08aa4d09e91909c3c1184a36de7dbe8759a29b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:02 compute-0 sudo[79108]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:02 compute-0 sudo[79260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmijtrjzoxaiaqrvjjlwqwnewdoojhis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050882.6877031-124-17040289814892/AnsiballZ_stat.py'
Nov 25 06:08:02 compute-0 sudo[79260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:03 compute-0 python3.9[79262]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:03 compute-0 sudo[79260]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:03 compute-0 sudo[79383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlvkvnaxwstbwfrgpxzoolvtqkrteldt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050882.6877031-124-17040289814892/AnsiballZ_copy.py'
Nov 25 06:08:03 compute-0 sudo[79383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:03 compute-0 python3.9[79385]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050882.6877031-124-17040289814892/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=ca2608f2f6b3d4c6b98fb6eeb0f19556db0c451f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:03 compute-0 sudo[79383]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:03 compute-0 sudo[79535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrochuyltgtsnmdflpyhahvmvmcgdjcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050883.5335815-168-12297380424929/AnsiballZ_file.py'
Nov 25 06:08:03 compute-0 sudo[79535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:03 compute-0 python3.9[79537]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:03 compute-0 sudo[79535]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:04 compute-0 sudo[79687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oetgeuycqfbrjrltkhsijjhorhiaubjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050883.964927-168-211442167479002/AnsiballZ_file.py'
Nov 25 06:08:04 compute-0 sudo[79687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:04 compute-0 python3.9[79689]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:04 compute-0 sudo[79687]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:04 compute-0 sudo[79839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlymdwzzbjutnxcpxekwkofeezrwctyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050884.432825-183-80322591552981/AnsiballZ_stat.py'
Nov 25 06:08:04 compute-0 sudo[79839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:04 compute-0 python3.9[79841]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:04 compute-0 sudo[79839]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:04 compute-0 sudo[79962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pluafpsdauifpqxdhotljjftebbnyvyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050884.432825-183-80322591552981/AnsiballZ_copy.py'
Nov 25 06:08:04 compute-0 sudo[79962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:05 compute-0 python3.9[79964]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050884.432825-183-80322591552981/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=bffc59b622e587e7de1ec308844cf7f81e72dcaf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:05 compute-0 sudo[79962]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:05 compute-0 sudo[80114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjkrvivuiiwehanxpzkvqxlvsitcmjuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050885.2479577-183-99458897795664/AnsiballZ_stat.py'
Nov 25 06:08:05 compute-0 sudo[80114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:05 compute-0 python3.9[80116]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:05 compute-0 sudo[80114]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:05 compute-0 sudo[80237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfaqrbrrmgacratayuvynvstccyhmthh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050885.2479577-183-99458897795664/AnsiballZ_copy.py'
Nov 25 06:08:05 compute-0 sudo[80237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:05 compute-0 python3.9[80239]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050885.2479577-183-99458897795664/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=183c4804b2e6f26e9a47ce76ccf288e738229c16 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:05 compute-0 sudo[80237]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:06 compute-0 sudo[80389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gneydwesfxxznjmvqfcytcftxhymoezn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050886.077101-183-16290220624274/AnsiballZ_stat.py'
Nov 25 06:08:06 compute-0 sudo[80389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:06 compute-0 python3.9[80391]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:06 compute-0 sudo[80389]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:06 compute-0 sudo[80512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfjbzunigjjoncgyqeonxkcmteugjngw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050886.077101-183-16290220624274/AnsiballZ_copy.py'
Nov 25 06:08:06 compute-0 sudo[80512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:06 compute-0 python3.9[80514]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050886.077101-183-16290220624274/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=21ba32932ef820e120851ed9538be07a51909241 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:06 compute-0 sudo[80512]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:07 compute-0 sudo[80664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-funjftigrzkwmjdqstsaoxqvmaiycjpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050886.9541101-227-106393259137673/AnsiballZ_file.py'
Nov 25 06:08:07 compute-0 sudo[80664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:07 compute-0 python3.9[80666]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:07 compute-0 sudo[80664]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:07 compute-0 sudo[80816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjrvrqtlzyqvvptnqesnthroncmdlhpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050887.3885906-227-173018322001061/AnsiballZ_file.py'
Nov 25 06:08:07 compute-0 sudo[80816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:07 compute-0 python3.9[80818]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:07 compute-0 sudo[80816]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:08 compute-0 sudo[80968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stzzniaqwapuxhabuwmmnhyzdffyfrgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050887.848622-242-130671708629586/AnsiballZ_stat.py'
Nov 25 06:08:08 compute-0 sudo[80968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:08 compute-0 python3.9[80970]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:08 compute-0 sudo[80968]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:08 compute-0 sudo[81091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aewfxqudiczssrsevrjzqhnqvmpbfqjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050887.848622-242-130671708629586/AnsiballZ_copy.py'
Nov 25 06:08:08 compute-0 sudo[81091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:08 compute-0 python3.9[81093]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050887.848622-242-130671708629586/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=c468d9cc9e0eb8eead8cd0ea404509e71b0393dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:08 compute-0 sudo[81091]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:08 compute-0 sudo[81243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lamsseciqbajzrambahqhbfkgtlfghdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050888.6485634-242-260823586030357/AnsiballZ_stat.py'
Nov 25 06:08:08 compute-0 sudo[81243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:08 compute-0 python3.9[81245]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:08 compute-0 sudo[81243]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:09 compute-0 sudo[81366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aworrdhlgzhnesuptevzmjuosglfjlnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050888.6485634-242-260823586030357/AnsiballZ_copy.py'
Nov 25 06:08:09 compute-0 sudo[81366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:09 compute-0 python3.9[81368]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050888.6485634-242-260823586030357/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=183c4804b2e6f26e9a47ce76ccf288e738229c16 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:09 compute-0 sudo[81366]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:09 compute-0 sudo[81518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqlgsgdptafmnfohwhxkgaypmjzfpldg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050889.4321764-242-149670871750622/AnsiballZ_stat.py'
Nov 25 06:08:09 compute-0 sudo[81518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:09 compute-0 python3.9[81520]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:09 compute-0 sudo[81518]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:09 compute-0 sudo[81641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuladjldtleivkbuuywpznnvpoirxtug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050889.4321764-242-149670871750622/AnsiballZ_copy.py'
Nov 25 06:08:09 compute-0 sudo[81641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:10 compute-0 python3.9[81643]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050889.4321764-242-149670871750622/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=14ff97ecce6c13584cc172f2f1709bbdbf57f0c6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:10 compute-0 sudo[81641]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:10 compute-0 sudo[81793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxiqabzftpwodevwlnrwvxfbiwinyzyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050890.7240834-302-109163367299645/AnsiballZ_file.py'
Nov 25 06:08:10 compute-0 sudo[81793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:11 compute-0 python3.9[81795]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:11 compute-0 sudo[81793]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:11 compute-0 sudo[81945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drmxtdpdaadveqzouepfqcxrzdxkrewk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050891.1760283-310-269861699047479/AnsiballZ_stat.py'
Nov 25 06:08:11 compute-0 sudo[81945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:11 compute-0 python3.9[81947]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:11 compute-0 sudo[81945]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:11 compute-0 sudo[82068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsylhlolzlenwbqxugcihypjvkvtslur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050891.1760283-310-269861699047479/AnsiballZ_copy.py'
Nov 25 06:08:11 compute-0 sudo[82068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:11 compute-0 python3.9[82070]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050891.1760283-310-269861699047479/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f66d7420451d7e559fc073a552573683f82f7762 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:11 compute-0 sudo[82068]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:12 compute-0 sudo[82220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbjfqcettzfhhkvtvxnynawpnzuharvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050892.0348816-326-11722088575241/AnsiballZ_file.py'
Nov 25 06:08:12 compute-0 sudo[82220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:12 compute-0 python3.9[82222]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:12 compute-0 sudo[82220]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:12 compute-0 sudo[82372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofuzthpicyrxteszgkixyazvwvlpoohw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050892.4834483-334-273137772867729/AnsiballZ_stat.py'
Nov 25 06:08:12 compute-0 sudo[82372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:12 compute-0 python3.9[82374]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:12 compute-0 sudo[82372]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:13 compute-0 sudo[82495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyfkxwsyukbkbdkssdqcjuustypgcipj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050892.4834483-334-273137772867729/AnsiballZ_copy.py'
Nov 25 06:08:13 compute-0 sudo[82495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:13 compute-0 python3.9[82497]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050892.4834483-334-273137772867729/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f66d7420451d7e559fc073a552573683f82f7762 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:13 compute-0 sudo[82495]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:13 compute-0 sudo[82647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mexjnsgwkcddswhatztjelnwibcwinod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050893.3599675-350-230251873819929/AnsiballZ_file.py'
Nov 25 06:08:13 compute-0 sudo[82647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:13 compute-0 python3.9[82649]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:13 compute-0 sudo[82647]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:14 compute-0 sudo[82799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tspouekhdvvdsyjrznivetvayznllfzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050893.833663-358-161404251583061/AnsiballZ_stat.py'
Nov 25 06:08:14 compute-0 sudo[82799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:14 compute-0 python3.9[82801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:14 compute-0 sudo[82799]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:14 compute-0 sudo[82922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbheefvxewmeiuigrwcrdookqbmcqetk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050893.833663-358-161404251583061/AnsiballZ_copy.py'
Nov 25 06:08:14 compute-0 sudo[82922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:14 compute-0 python3.9[82924]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050893.833663-358-161404251583061/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f66d7420451d7e559fc073a552573683f82f7762 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:14 compute-0 sudo[82922]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:14 compute-0 sudo[83074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ultgefvodueafkwufhmgohzlbwqhhblc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050894.7296782-374-60238624060783/AnsiballZ_file.py'
Nov 25 06:08:14 compute-0 sudo[83074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:15 compute-0 python3.9[83076]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:15 compute-0 sudo[83074]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:15 compute-0 sudo[83226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itftunuytfycgeimytzejihsgsrcswhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050895.1924436-382-14245171319651/AnsiballZ_stat.py'
Nov 25 06:08:15 compute-0 sudo[83226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:15 compute-0 python3.9[83228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:15 compute-0 sudo[83226]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:15 compute-0 sudo[83349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrycrswsiqqsibriknjsrrttwzxqdaik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050895.1924436-382-14245171319651/AnsiballZ_copy.py'
Nov 25 06:08:15 compute-0 sudo[83349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:15 compute-0 python3.9[83351]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050895.1924436-382-14245171319651/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f66d7420451d7e559fc073a552573683f82f7762 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:15 compute-0 sudo[83349]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:16 compute-0 sudo[83501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agvmrvxazvzznrhupgymioefpjxelvrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050896.0649426-398-21451362121995/AnsiballZ_file.py'
Nov 25 06:08:16 compute-0 sudo[83501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:16 compute-0 python3.9[83503]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:16 compute-0 sudo[83501]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:16 compute-0 sudo[83653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ervxduiotkqqrkxnzmgsqocdpoejmlej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050896.5237691-406-172297753659186/AnsiballZ_stat.py'
Nov 25 06:08:16 compute-0 sudo[83653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:16 compute-0 python3.9[83655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:16 compute-0 sudo[83653]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:17 compute-0 sudo[83776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmeumiqijjixuuhfzociaciqxabyaydy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050896.5237691-406-172297753659186/AnsiballZ_copy.py'
Nov 25 06:08:17 compute-0 sudo[83776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:17 compute-0 python3.9[83778]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050896.5237691-406-172297753659186/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f66d7420451d7e559fc073a552573683f82f7762 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:17 compute-0 sudo[83776]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:17 compute-0 sudo[83928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfjahlzkvrhqcetqgrxpxbutnwmbnrpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050897.4240966-422-234669356890157/AnsiballZ_file.py'
Nov 25 06:08:17 compute-0 sudo[83928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:17 compute-0 python3.9[83930]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:17 compute-0 sudo[83928]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:18 compute-0 sudo[84080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpidyjebnncbhckrvwfgetnwubqboevj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050897.881171-430-101679943628980/AnsiballZ_stat.py'
Nov 25 06:08:18 compute-0 sudo[84080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:18 compute-0 python3.9[84082]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:18 compute-0 sudo[84080]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:18 compute-0 sudo[84203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etrvvytfgzwtxyohssoysewmxpatlxbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050897.881171-430-101679943628980/AnsiballZ_copy.py'
Nov 25 06:08:18 compute-0 sudo[84203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:18 compute-0 python3.9[84205]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050897.881171-430-101679943628980/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f66d7420451d7e559fc073a552573683f82f7762 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:18 compute-0 sudo[84203]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:18 compute-0 sudo[84355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nerqtymrupanvtgfavekbwdrcyevdkkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050898.7590437-446-17888006968230/AnsiballZ_file.py'
Nov 25 06:08:18 compute-0 sudo[84355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:19 compute-0 python3.9[84357]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:19 compute-0 sudo[84355]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:19 compute-0 sudo[84507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciomzxncmealtfggvkzxgevirehipnln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050899.2332518-454-84421419661719/AnsiballZ_stat.py'
Nov 25 06:08:19 compute-0 sudo[84507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:19 compute-0 python3.9[84509]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:19 compute-0 sudo[84507]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:19 compute-0 sudo[84630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dksosulzofbbqdhbpaoxrmscwzjmkybf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050899.2332518-454-84421419661719/AnsiballZ_copy.py'
Nov 25 06:08:19 compute-0 sudo[84630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:19 compute-0 python3.9[84632]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050899.2332518-454-84421419661719/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f66d7420451d7e559fc073a552573683f82f7762 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:19 compute-0 sudo[84630]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:20 compute-0 sshd-session[76973]: Connection closed by 192.168.122.30 port 59406
Nov 25 06:08:20 compute-0 sshd-session[76970]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:08:20 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Nov 25 06:08:20 compute-0 systemd[1]: session-18.scope: Consumed 19.350s CPU time.
Nov 25 06:08:20 compute-0 systemd-logind[744]: Session 18 logged out. Waiting for processes to exit.
Nov 25 06:08:20 compute-0 systemd-logind[744]: Removed session 18.
Nov 25 06:08:25 compute-0 sshd-session[84657]: Accepted publickey for zuul from 192.168.122.30 port 53636 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:08:25 compute-0 systemd-logind[744]: New session 19 of user zuul.
Nov 25 06:08:25 compute-0 systemd[1]: Started Session 19 of User zuul.
Nov 25 06:08:25 compute-0 sshd-session[84657]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:08:25 compute-0 python3.9[84810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:08:26 compute-0 sudo[84965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmqrdyvwcvgqxdjwinxsoblqbfacvfcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050906.25395-34-131381832139577/AnsiballZ_file.py'
Nov 25 06:08:26 compute-0 sudo[84965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:26 compute-0 python3.9[84967]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:26 compute-0 sudo[84965]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:27 compute-0 sudo[85117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrsdqzdcsnzkmeamdyalkwjuwuukvtkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050906.8340657-34-23296004338877/AnsiballZ_file.py'
Nov 25 06:08:27 compute-0 sudo[85117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:27 compute-0 python3.9[85119]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:27 compute-0 sudo[85117]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:27 compute-0 python3.9[85269]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:08:28 compute-0 sudo[85419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqpdpnuundflelzjhcjmisphbevrawhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050907.8330243-57-9794084693255/AnsiballZ_seboolean.py'
Nov 25 06:08:28 compute-0 sudo[85419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:28 compute-0 python3.9[85421]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 25 06:08:29 compute-0 sudo[85419]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:29 compute-0 sudo[85575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwdeaqcexorgkybwrgybzxeycyvkdcii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050909.2418394-67-267327793368381/AnsiballZ_setup.py'
Nov 25 06:08:29 compute-0 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 25 06:08:29 compute-0 sudo[85575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:29 compute-0 python3.9[85577]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:08:29 compute-0 sudo[85575]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:30 compute-0 sudo[85659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sktrocbvuadrospgwqcpaymkususqnvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050909.2418394-67-267327793368381/AnsiballZ_dnf.py'
Nov 25 06:08:30 compute-0 sudo[85659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:30 compute-0 python3.9[85661]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:08:31 compute-0 sudo[85659]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:31 compute-0 sudo[85812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynivbdxwsaxjilsncyangoivfvyuzrwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050911.47019-79-87152820021869/AnsiballZ_systemd.py'
Nov 25 06:08:31 compute-0 sudo[85812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:32 compute-0 python3.9[85814]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 06:08:32 compute-0 sudo[85812]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:32 compute-0 sudo[85967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulahvkforiixibjgrpdkbrnyspqiqzfv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764050912.3062656-87-221093185108485/AnsiballZ_edpm_nftables_snippet.py'
Nov 25 06:08:32 compute-0 sudo[85967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:32 compute-0 python3[85969]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 25 06:08:32 compute-0 sudo[85967]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:33 compute-0 sudo[86119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrnewnulcgmslbsqtqkzjbvnlhfeqycd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050912.9517875-96-100483598646834/AnsiballZ_file.py'
Nov 25 06:08:33 compute-0 sudo[86119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:33 compute-0 python3.9[86121]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:33 compute-0 sudo[86119]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:33 compute-0 sudo[86271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybzmufxfyrvphgomugchbrxkasuzhwta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050913.4108217-104-267928521305652/AnsiballZ_stat.py'
Nov 25 06:08:33 compute-0 sudo[86271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:33 compute-0 python3.9[86273]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:33 compute-0 sudo[86271]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:34 compute-0 sudo[86349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-valuyxhhqqarduygtjvfnuupffymlyyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050913.4108217-104-267928521305652/AnsiballZ_file.py'
Nov 25 06:08:34 compute-0 sudo[86349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:34 compute-0 python3.9[86351]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:34 compute-0 sudo[86349]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:34 compute-0 sudo[86501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nltfzqvkvjeuseibtywohybilglwuahs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050914.2981346-116-147940469846737/AnsiballZ_stat.py'
Nov 25 06:08:34 compute-0 sudo[86501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:34 compute-0 python3.9[86503]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:34 compute-0 sudo[86501]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:34 compute-0 sudo[86579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjolemeyenhsupbbxddrhsdeyarbcwpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050914.2981346-116-147940469846737/AnsiballZ_file.py'
Nov 25 06:08:34 compute-0 sudo[86579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:34 compute-0 python3.9[86581]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.xzgve2ma recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:34 compute-0 sudo[86579]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:35 compute-0 sudo[86731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svqwdwiigmqjamzvokjhdzkwehaspxor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050915.070253-128-91276211564959/AnsiballZ_stat.py'
Nov 25 06:08:35 compute-0 sudo[86731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:35 compute-0 python3.9[86733]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:35 compute-0 sudo[86731]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:35 compute-0 sudo[86809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pacyokocppmmexqcymjldawaahhcvwge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050915.070253-128-91276211564959/AnsiballZ_file.py'
Nov 25 06:08:35 compute-0 sudo[86809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:35 compute-0 python3.9[86811]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:35 compute-0 sudo[86809]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:36 compute-0 sudo[86961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egewakyzokyuxhmczgxffmyyjyvcmjph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050915.8693056-141-55527635942904/AnsiballZ_command.py'
Nov 25 06:08:36 compute-0 sudo[86961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:36 compute-0 python3.9[86963]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:08:36 compute-0 sudo[86961]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:36 compute-0 sudo[87114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oznohwqlndsgwapnelmtscmgqwwtdacc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764050916.5461116-149-92758506280693/AnsiballZ_edpm_nftables_from_files.py'
Nov 25 06:08:36 compute-0 sudo[87114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:36 compute-0 python3[87116]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 25 06:08:37 compute-0 sudo[87114]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:37 compute-0 sudo[87266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waqabrbzadelwlztvgjiqjzzqgjihtyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050917.136373-157-255672549812256/AnsiballZ_stat.py'
Nov 25 06:08:37 compute-0 sudo[87266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:37 compute-0 python3.9[87268]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:37 compute-0 sudo[87266]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:37 compute-0 sudo[87391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xigzqfigdeyamcznfqlfdashkbdofhhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050917.136373-157-255672549812256/AnsiballZ_copy.py'
Nov 25 06:08:37 compute-0 sudo[87391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:38 compute-0 python3.9[87393]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050917.136373-157-255672549812256/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:38 compute-0 sudo[87391]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:38 compute-0 sudo[87543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bawhayhgykdbclaogvhfpveofiyltsxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050918.1357548-172-120260944377351/AnsiballZ_stat.py'
Nov 25 06:08:38 compute-0 sudo[87543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:38 compute-0 python3.9[87545]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:38 compute-0 sudo[87543]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:38 compute-0 sudo[87668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyxpaxfpacpxshpkpoxsiyktopphnbwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050918.1357548-172-120260944377351/AnsiballZ_copy.py'
Nov 25 06:08:38 compute-0 sudo[87668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:38 compute-0 python3.9[87670]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050918.1357548-172-120260944377351/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:38 compute-0 sudo[87668]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:39 compute-0 sudo[87820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkljzrcuoolctfratwocilylnsdkyymx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050918.9983547-187-158314164981245/AnsiballZ_stat.py'
Nov 25 06:08:39 compute-0 sudo[87820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:39 compute-0 python3.9[87822]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:39 compute-0 sudo[87820]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:39 compute-0 sudo[87945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbmwsyaqocllqvchvmbgwbhvywfpudhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050918.9983547-187-158314164981245/AnsiballZ_copy.py'
Nov 25 06:08:39 compute-0 sudo[87945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:39 compute-0 python3.9[87947]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050918.9983547-187-158314164981245/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:39 compute-0 sudo[87945]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:40 compute-0 sudo[88097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aptmjlmuxwrkjzrfkvjsbyszxclutvae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050919.8624315-202-14567805806357/AnsiballZ_stat.py'
Nov 25 06:08:40 compute-0 sudo[88097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:40 compute-0 python3.9[88099]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:40 compute-0 sudo[88097]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:40 compute-0 sudo[88222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rppqsucimggkzfgvfogymblhnlawyurn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050919.8624315-202-14567805806357/AnsiballZ_copy.py'
Nov 25 06:08:40 compute-0 sudo[88222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:40 compute-0 python3.9[88224]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050919.8624315-202-14567805806357/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:40 compute-0 sudo[88222]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:40 compute-0 sudo[88374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onawertecbfacpvbcesjybebjbdtykxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050920.6985095-217-9510618585740/AnsiballZ_stat.py'
Nov 25 06:08:40 compute-0 sudo[88374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:41 compute-0 python3.9[88376]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:41 compute-0 sudo[88374]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:41 compute-0 sudo[88499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpefszpclfjxfrwzgiohekvxrgswfbih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050920.6985095-217-9510618585740/AnsiballZ_copy.py'
Nov 25 06:08:41 compute-0 sudo[88499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:41 compute-0 python3.9[88501]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764050920.6985095-217-9510618585740/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:41 compute-0 sudo[88499]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:41 compute-0 sudo[88651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdosmzkzkxxmdewckkwjfwwvanckftdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050921.5991678-232-156283283421978/AnsiballZ_file.py'
Nov 25 06:08:41 compute-0 sudo[88651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:41 compute-0 python3.9[88653]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:41 compute-0 sudo[88651]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:42 compute-0 sudo[88803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxplsndvoljrrbhwovhdrfvyzzmfghnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050922.059714-240-191088088604247/AnsiballZ_command.py'
Nov 25 06:08:42 compute-0 sudo[88803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:42 compute-0 python3.9[88805]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:08:42 compute-0 sudo[88803]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:42 compute-0 sudo[88958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxhuovgspajmobglfzstwuvrusdnhxvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050922.5271297-248-159913532305728/AnsiballZ_blockinfile.py'
Nov 25 06:08:42 compute-0 sudo[88958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:42 compute-0 python3.9[88960]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:43 compute-0 sudo[88958]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:43 compute-0 sudo[89110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucsxapwctvviwhhvazqbuqsoxojizovy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050923.1649613-257-103887683841515/AnsiballZ_command.py'
Nov 25 06:08:43 compute-0 sudo[89110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:43 compute-0 python3.9[89112]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:08:43 compute-0 sudo[89110]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:43 compute-0 sudo[89263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcxsydhcxikhbsjoojcvqggkfgtuuimg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050923.638852-265-187849400016160/AnsiballZ_stat.py'
Nov 25 06:08:43 compute-0 sudo[89263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:43 compute-0 python3.9[89265]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:08:44 compute-0 sudo[89263]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:44 compute-0 sudo[89417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsfwxelolufbrpvdfjfuxwpcilywercs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050924.12621-273-100090535438993/AnsiballZ_command.py'
Nov 25 06:08:44 compute-0 sudo[89417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:44 compute-0 python3.9[89419]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:08:44 compute-0 sudo[89417]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:44 compute-0 sudo[89572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frexkvllomnncbukrpikxmrcbcefdthq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050924.5968282-281-271092147479686/AnsiballZ_file.py'
Nov 25 06:08:44 compute-0 sudo[89572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:44 compute-0 python3.9[89574]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:44 compute-0 sudo[89572]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:45 compute-0 python3.9[89724]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:08:46 compute-0 sudo[89875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikigfasnarmrfczqlosbzaiuerjoidik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050926.227234-321-89973763690798/AnsiballZ_command.py'
Nov 25 06:08:46 compute-0 sudo[89875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:46 compute-0 python3.9[89877]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:93:45:69:49" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:08:46 compute-0 ovs-vsctl[89878]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:93:45:69:49 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 25 06:08:46 compute-0 sudo[89875]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:46 compute-0 sudo[90028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rffbtiiibdodnskruthfjqgbuihgmhau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050926.710854-330-228310377417335/AnsiballZ_command.py'
Nov 25 06:08:46 compute-0 sudo[90028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:47 compute-0 python3.9[90030]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:08:47 compute-0 sudo[90028]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:47 compute-0 sudo[90183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzszydcajrmykjbsdrphypefoyaadqfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050927.1556363-338-117432274762030/AnsiballZ_command.py'
Nov 25 06:08:47 compute-0 sudo[90183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:47 compute-0 python3.9[90185]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:08:47 compute-0 ovs-vsctl[90186]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 25 06:08:47 compute-0 sudo[90183]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:47 compute-0 python3.9[90336]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:08:48 compute-0 sudo[90488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjdffxukhvrwqqcbpicosytwhctpporu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050928.1023118-355-18431573423438/AnsiballZ_file.py'
Nov 25 06:08:48 compute-0 sudo[90488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:48 compute-0 python3.9[90490]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:48 compute-0 sudo[90488]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:48 compute-0 sudo[90640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwwhgkqmxueveuofwchamlmfkcuhiary ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050928.5507355-363-97324327660001/AnsiballZ_stat.py'
Nov 25 06:08:48 compute-0 sudo[90640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:48 compute-0 python3.9[90642]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:48 compute-0 sudo[90640]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:48 compute-0 chronyd[64687]: Selected source 23.186.168.131 (pool.ntp.org)
Nov 25 06:08:49 compute-0 sudo[90718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urhwlcurinsljoaeuwpjutpsilrgaayv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050928.5507355-363-97324327660001/AnsiballZ_file.py'
Nov 25 06:08:49 compute-0 sudo[90718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:49 compute-0 python3.9[90720]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:49 compute-0 sudo[90718]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:49 compute-0 sudo[90870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeipaggwpzctsvmyzedqmnehfgdvsguh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050929.2777534-363-23320462424508/AnsiballZ_stat.py'
Nov 25 06:08:49 compute-0 sudo[90870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:49 compute-0 python3.9[90872]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:49 compute-0 sudo[90870]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:49 compute-0 sudo[90948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-covoslucwxnhyeymvcpwinzlbkdjbbrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050929.2777534-363-23320462424508/AnsiballZ_file.py'
Nov 25 06:08:49 compute-0 sudo[90948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:50 compute-0 python3.9[90950]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:50 compute-0 sudo[90948]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:50 compute-0 sudo[91100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gquvmzpyaauwahwsqfclwogjvnuafpyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050930.2201846-386-9663712533809/AnsiballZ_file.py'
Nov 25 06:08:50 compute-0 sudo[91100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:50 compute-0 python3.9[91102]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:50 compute-0 sudo[91100]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:50 compute-0 sudo[91252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfumwxqajiskorpxrwkmfnqoybxwgmip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050930.6810944-394-277641341339002/AnsiballZ_stat.py'
Nov 25 06:08:50 compute-0 sudo[91252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:51 compute-0 python3.9[91254]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:51 compute-0 sudo[91252]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:51 compute-0 sudo[91330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuydrdpnfzfdcjsjihkahoylcgxcjghv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050930.6810944-394-277641341339002/AnsiballZ_file.py'
Nov 25 06:08:51 compute-0 sudo[91330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:51 compute-0 python3.9[91332]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:51 compute-0 sudo[91330]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:51 compute-0 sudo[91482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lawukniozmmnvdicmtznzmcpiiyabclg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050931.4768739-406-182267122068994/AnsiballZ_stat.py'
Nov 25 06:08:51 compute-0 sudo[91482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:51 compute-0 python3.9[91484]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:51 compute-0 sudo[91482]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:51 compute-0 sudo[91560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-folvmfsththlumxbiieqihflsmddciok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050931.4768739-406-182267122068994/AnsiballZ_file.py'
Nov 25 06:08:51 compute-0 sudo[91560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:52 compute-0 python3.9[91562]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:52 compute-0 sudo[91560]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:52 compute-0 sudo[91712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlzcqfvofpbhmonlfnbunwfibwxgxzdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050932.265099-418-38754716900562/AnsiballZ_systemd.py'
Nov 25 06:08:52 compute-0 sudo[91712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:52 compute-0 python3.9[91714]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:08:52 compute-0 systemd[1]: Reloading.
Nov 25 06:08:52 compute-0 systemd-rc-local-generator[91738]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:08:52 compute-0 systemd-sysv-generator[91742]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:08:52 compute-0 sudo[91712]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:53 compute-0 sudo[91901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pteqpkzgzolctyjhzxmxksnlglgvrdht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050933.036512-426-98952624340234/AnsiballZ_stat.py'
Nov 25 06:08:53 compute-0 sudo[91901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:53 compute-0 python3.9[91903]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:53 compute-0 sudo[91901]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:53 compute-0 sudo[91979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqsucnndqrnojdtadiaoggjixijpnaht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050933.036512-426-98952624340234/AnsiballZ_file.py'
Nov 25 06:08:53 compute-0 sudo[91979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:53 compute-0 python3.9[91981]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:53 compute-0 sudo[91979]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:54 compute-0 sudo[92131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfvszqusoqgtjwihlpmxxhmcnlbfyycp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050933.8294547-438-217099557108335/AnsiballZ_stat.py'
Nov 25 06:08:54 compute-0 sudo[92131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:54 compute-0 python3.9[92133]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:54 compute-0 sudo[92131]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:54 compute-0 sudo[92209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbkviehyttqrnlkdzqmtnaftspirljxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050933.8294547-438-217099557108335/AnsiballZ_file.py'
Nov 25 06:08:54 compute-0 sudo[92209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:54 compute-0 python3.9[92211]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:54 compute-0 sudo[92209]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:54 compute-0 sudo[92361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atdmztgoqgfeytlennvphgiaumlnvslm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050934.6981807-450-58671472097192/AnsiballZ_systemd.py'
Nov 25 06:08:54 compute-0 sudo[92361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:55 compute-0 python3.9[92363]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:08:55 compute-0 systemd[1]: Reloading.
Nov 25 06:08:55 compute-0 systemd-rc-local-generator[92385]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:08:55 compute-0 systemd-sysv-generator[92388]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:08:55 compute-0 systemd[1]: Starting Create netns directory...
Nov 25 06:08:55 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 25 06:08:55 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 25 06:08:55 compute-0 systemd[1]: Finished Create netns directory.
Nov 25 06:08:55 compute-0 sudo[92361]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:55 compute-0 sudo[92555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gszafnbxfskhkvjsbaepxguiprugeykx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050935.5581057-460-46810723368354/AnsiballZ_file.py'
Nov 25 06:08:55 compute-0 sudo[92555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:55 compute-0 python3.9[92557]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:55 compute-0 sudo[92555]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:56 compute-0 sudo[92707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eipuacnvxpzrwfderixslqhmhpzrpiid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050936.0384188-468-80977213324133/AnsiballZ_stat.py'
Nov 25 06:08:56 compute-0 sudo[92707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:56 compute-0 python3.9[92709]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:56 compute-0 sudo[92707]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:56 compute-0 sudo[92830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxdwfhnhnqryuuaoeursfkllkgjiteac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050936.0384188-468-80977213324133/AnsiballZ_copy.py'
Nov 25 06:08:56 compute-0 sudo[92830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:56 compute-0 python3.9[92832]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050936.0384188-468-80977213324133/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:56 compute-0 sudo[92830]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:57 compute-0 sudo[92982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbhtyyespvsxtpjrorjkligicjlmnqiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050937.0915058-485-180163065845961/AnsiballZ_file.py'
Nov 25 06:08:57 compute-0 sudo[92982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:57 compute-0 python3.9[92984]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:08:57 compute-0 sudo[92982]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:57 compute-0 sudo[93134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjswdnyqifwzzsefwpychlrqepcxpnix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050937.5771189-493-255062222232960/AnsiballZ_stat.py'
Nov 25 06:08:57 compute-0 sudo[93134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:57 compute-0 python3.9[93136]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:08:57 compute-0 sudo[93134]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:58 compute-0 sudo[93257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frjrfioslgzngcgwzfdnppaqfrevdfyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050937.5771189-493-255062222232960/AnsiballZ_copy.py'
Nov 25 06:08:58 compute-0 sudo[93257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:58 compute-0 python3.9[93259]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764050937.5771189-493-255062222232960/.source.json _original_basename=.s0li_78i follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:58 compute-0 sudo[93257]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:58 compute-0 sudo[93409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqnetefxjotpkqkgqajaefasanbkblid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050938.5669289-508-7617278773304/AnsiballZ_file.py'
Nov 25 06:08:58 compute-0 sudo[93409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:58 compute-0 python3.9[93411]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:08:58 compute-0 sudo[93409]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:59 compute-0 sudo[93561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugydjkucouqcghwmjuznivuwuxlgiobr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050939.0660625-516-231008616410337/AnsiballZ_stat.py'
Nov 25 06:08:59 compute-0 sudo[93561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:59 compute-0 sudo[93561]: pam_unix(sudo:session): session closed for user root
Nov 25 06:08:59 compute-0 sudo[93684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjbvwfgligjkhwkjutfjjoitqrpzdyeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050939.0660625-516-231008616410337/AnsiballZ_copy.py'
Nov 25 06:08:59 compute-0 sudo[93684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:08:59 compute-0 sudo[93684]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:00 compute-0 sudo[93836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smqavwgelmtzsasgvmyatjypsjjmhrvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050940.0265725-533-6391072700483/AnsiballZ_container_config_data.py'
Nov 25 06:09:00 compute-0 sudo[93836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:00 compute-0 python3.9[93838]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 25 06:09:00 compute-0 sudo[93836]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:00 compute-0 sudo[93988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfzyphndcyzgmurlfyqdjhumvalwrmop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050940.6414247-542-126601140187667/AnsiballZ_container_config_hash.py'
Nov 25 06:09:00 compute-0 sudo[93988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:01 compute-0 python3.9[93990]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 06:09:01 compute-0 sudo[93988]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:01 compute-0 sudo[94140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gayifgptlqgfsusrhooujtjslwjhswax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050941.2946932-551-248408247854684/AnsiballZ_podman_container_info.py'
Nov 25 06:09:01 compute-0 sudo[94140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:01 compute-0 python3.9[94142]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 25 06:09:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:09:01 compute-0 sudo[94140]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:02 compute-0 sudo[94302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qflbqhzulbkghgwdgqlfzqfgansgtunj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764050942.149905-564-74042100753638/AnsiballZ_edpm_container_manage.py'
Nov 25 06:09:02 compute-0 sudo[94302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:02 compute-0 python3[94304]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 06:09:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:09:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:09:02 compute-0 podman[94333]: 2025-11-25 06:09:02.826166158 +0000 UTC m=+0.027833503 container create f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:09:02 compute-0 podman[94333]: 2025-11-25 06:09:02.813337202 +0000 UTC m=+0.015004538 image pull fb385c849c98a3c678a3d627f4cb894eda21a9dce6ba3cc1ef408e332ab6bee7 quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:09:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:09:02 compute-0 python3[94304]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:09:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 06:09:02 compute-0 sudo[94302]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:03 compute-0 sudo[94512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngglkkhfjhakmmgwmxzfqmsjwpxbwmxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050943.0355003-572-100031654129445/AnsiballZ_stat.py'
Nov 25 06:09:03 compute-0 sudo[94512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:03 compute-0 python3.9[94514]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:09:03 compute-0 sudo[94512]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:03 compute-0 sudo[94666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfharadfwiysjuiviakemaughzepsnht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050943.5820916-581-104924373067138/AnsiballZ_file.py'
Nov 25 06:09:03 compute-0 sudo[94666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:03 compute-0 python3.9[94668]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:09:03 compute-0 sudo[94666]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:04 compute-0 sudo[94742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqwbeajmqpkcqsggokjitknzaetqisfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050943.5820916-581-104924373067138/AnsiballZ_stat.py'
Nov 25 06:09:04 compute-0 sudo[94742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:04 compute-0 python3.9[94744]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:09:04 compute-0 sudo[94742]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:04 compute-0 sudo[94893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffozbhbftvjbazbfcshxsdacraopftbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050944.2681139-581-180077907934130/AnsiballZ_copy.py'
Nov 25 06:09:04 compute-0 sudo[94893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:04 compute-0 python3.9[94895]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764050944.2681139-581-180077907934130/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:09:04 compute-0 sudo[94893]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:04 compute-0 sudo[94969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqkcncnlnoyidfdebgfibabwlegnaame ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050944.2681139-581-180077907934130/AnsiballZ_systemd.py'
Nov 25 06:09:04 compute-0 sudo[94969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:05 compute-0 python3.9[94971]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:09:05 compute-0 systemd[1]: Reloading.
Nov 25 06:09:05 compute-0 systemd-sysv-generator[94997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:09:05 compute-0 systemd-rc-local-generator[94994]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:09:05 compute-0 sudo[94969]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:05 compute-0 sudo[95080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvmvbpgalvefuarvtvuxrmzisxqssbpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050944.2681139-581-180077907934130/AnsiballZ_systemd.py'
Nov 25 06:09:05 compute-0 sudo[95080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:05 compute-0 python3.9[95082]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:09:05 compute-0 systemd[1]: Reloading.
Nov 25 06:09:05 compute-0 systemd-rc-local-generator[95107]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:09:05 compute-0 systemd-sysv-generator[95111]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:09:05 compute-0 systemd[1]: Starting ovn_controller container...
Nov 25 06:09:06 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 25 06:09:06 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:09:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3061c4acf9442e51aac1d4f9ed464b968abc67816313583fc29adeea528f3dd7/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 25 06:09:06 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54.
Nov 25 06:09:06 compute-0 podman[95123]: 2025-11-25 06:09:06.059921828 +0000 UTC m=+0.085871948 container init f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller)
Nov 25 06:09:06 compute-0 ovn_controller[95135]: + sudo -E kolla_set_configs
Nov 25 06:09:06 compute-0 podman[95123]: 2025-11-25 06:09:06.08131549 +0000 UTC m=+0.107265590 container start f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 06:09:06 compute-0 edpm-start-podman-container[95123]: ovn_controller
Nov 25 06:09:06 compute-0 systemd[1]: Created slice User Slice of UID 0.
Nov 25 06:09:06 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 25 06:09:06 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 25 06:09:06 compute-0 systemd[1]: Starting User Manager for UID 0...
Nov 25 06:09:06 compute-0 systemd[95168]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Nov 25 06:09:06 compute-0 edpm-start-podman-container[95122]: Creating additional drop-in dependency for "ovn_controller" (f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54)
Nov 25 06:09:06 compute-0 podman[95142]: 2025-11-25 06:09:06.137951851 +0000 UTC m=+0.047465633 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 25 06:09:06 compute-0 systemd[1]: f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54-7042dc74f0d1d779.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 06:09:06 compute-0 systemd[1]: f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54-7042dc74f0d1d779.service: Failed with result 'exit-code'.
Nov 25 06:09:06 compute-0 systemd[1]: Reloading.
Nov 25 06:09:06 compute-0 systemd[95168]: Queued start job for default target Main User Target.
Nov 25 06:09:06 compute-0 systemd[95168]: Created slice User Application Slice.
Nov 25 06:09:06 compute-0 systemd[95168]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 25 06:09:06 compute-0 systemd[95168]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 06:09:06 compute-0 systemd[95168]: Reached target Paths.
Nov 25 06:09:06 compute-0 systemd[95168]: Reached target Timers.
Nov 25 06:09:06 compute-0 systemd[95168]: Starting D-Bus User Message Bus Socket...
Nov 25 06:09:06 compute-0 systemd[95168]: Starting Create User's Volatile Files and Directories...
Nov 25 06:09:06 compute-0 systemd-rc-local-generator[95213]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:09:06 compute-0 systemd[95168]: Finished Create User's Volatile Files and Directories.
Nov 25 06:09:06 compute-0 systemd-sysv-generator[95216]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:09:06 compute-0 systemd[95168]: Listening on D-Bus User Message Bus Socket.
Nov 25 06:09:06 compute-0 systemd[95168]: Reached target Sockets.
Nov 25 06:09:06 compute-0 systemd[95168]: Reached target Basic System.
Nov 25 06:09:06 compute-0 systemd[95168]: Reached target Main User Target.
Nov 25 06:09:06 compute-0 systemd[95168]: Startup finished in 97ms.
Nov 25 06:09:06 compute-0 systemd[1]: Started User Manager for UID 0.
Nov 25 06:09:06 compute-0 systemd[1]: Started ovn_controller container.
Nov 25 06:09:06 compute-0 systemd[1]: Started Session c1 of User root.
Nov 25 06:09:06 compute-0 sudo[95080]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:06 compute-0 ovn_controller[95135]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 06:09:06 compute-0 ovn_controller[95135]: INFO:__main__:Validating config file
Nov 25 06:09:06 compute-0 ovn_controller[95135]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 06:09:06 compute-0 ovn_controller[95135]: INFO:__main__:Writing out command to execute
Nov 25 06:09:06 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 25 06:09:06 compute-0 ovn_controller[95135]: ++ cat /run_command
Nov 25 06:09:06 compute-0 ovn_controller[95135]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 25 06:09:06 compute-0 ovn_controller[95135]: + ARGS=
Nov 25 06:09:06 compute-0 ovn_controller[95135]: + sudo kolla_copy_cacerts
Nov 25 06:09:06 compute-0 systemd[1]: Started Session c2 of User root.
Nov 25 06:09:06 compute-0 ovn_controller[95135]: + [[ ! -n '' ]]
Nov 25 06:09:06 compute-0 ovn_controller[95135]: + . kolla_extend_start
Nov 25 06:09:06 compute-0 ovn_controller[95135]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 25 06:09:06 compute-0 ovn_controller[95135]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 25 06:09:06 compute-0 ovn_controller[95135]: + umask 0022
Nov 25 06:09:06 compute-0 ovn_controller[95135]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 25 06:09:06 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Nov 25 06:09:06 compute-0 ovn_controller[95135]: 2025-11-25T06:09:06Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Nov 25 06:09:06 compute-0 NetworkManager[55345]: <info>  [1764050946.4645] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 25 06:09:06 compute-0 NetworkManager[55345]: <info>  [1764050946.4650] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:09:06 compute-0 NetworkManager[55345]: <info>  [1764050946.4657] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 25 06:09:06 compute-0 NetworkManager[55345]: <info>  [1764050946.4661] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 25 06:09:06 compute-0 NetworkManager[55345]: <info>  [1764050946.4664] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 25 06:09:06 compute-0 kernel: br-int: entered promiscuous mode
Nov 25 06:09:06 compute-0 systemd-udevd[95289]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:09:06 compute-0 sudo[95394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evktupxuvresxjdpxcwzommifrgzazjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050946.4966729-609-278949561638674/AnsiballZ_command.py'
Nov 25 06:09:06 compute-0 sudo[95394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:06 compute-0 python3.9[95396]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:09:06 compute-0 ovs-vsctl[95397]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 25 06:09:06 compute-0 sudo[95394]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:07 compute-0 sudo[95547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdsjiatjkfentwgnuufbxoufiaqtssif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050946.9780228-617-43246882414347/AnsiballZ_command.py'
Nov 25 06:09:07 compute-0 sudo[95547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:07 compute-0 python3.9[95549]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:09:07 compute-0 ovs-vsctl[95551]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 25 06:09:07 compute-0 sudo[95547]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00021|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00022|features|INFO|OVS Feature: meter_support, state: supported
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00023|features|INFO|OVS Feature: group_support, state: supported
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00024|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00025|features|INFO|OVS Feature: ct_flush, state: supported
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00026|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00027|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00028|main|INFO|OVS feature set changed, force recompute.
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00029|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00030|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00032|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00033|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00034|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00035|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 25 06:09:07 compute-0 ovn_controller[95135]: 2025-11-25T06:09:07Z|00036|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 25 06:09:07 compute-0 NetworkManager[55345]: <info>  [1764050947.4804] manager: (ovn-eb580c-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 25 06:09:07 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Nov 25 06:09:07 compute-0 NetworkManager[55345]: <info>  [1764050947.4939] device (genev_sys_6081): carrier: link connected
Nov 25 06:09:07 compute-0 NetworkManager[55345]: <info>  [1764050947.4941] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Nov 25 06:09:07 compute-0 sudo[95705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqapmchxoprrpldpldzwhqtbnvakajyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050947.6439989-631-78926903172942/AnsiballZ_command.py'
Nov 25 06:09:07 compute-0 sudo[95705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:07 compute-0 python3.9[95707]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:09:07 compute-0 ovs-vsctl[95708]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 25 06:09:07 compute-0 sudo[95705]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:08 compute-0 sshd-session[84660]: Connection closed by 192.168.122.30 port 53636
Nov 25 06:09:08 compute-0 sshd-session[84657]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:09:08 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Nov 25 06:09:08 compute-0 systemd[1]: session-19.scope: Consumed 30.759s CPU time.
Nov 25 06:09:08 compute-0 systemd-logind[744]: Session 19 logged out. Waiting for processes to exit.
Nov 25 06:09:08 compute-0 systemd-logind[744]: Removed session 19.
Nov 25 06:09:13 compute-0 sshd-session[95733]: Accepted publickey for zuul from 192.168.122.30 port 57140 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:09:13 compute-0 systemd-logind[744]: New session 21 of user zuul.
Nov 25 06:09:13 compute-0 systemd[1]: Started Session 21 of User zuul.
Nov 25 06:09:13 compute-0 sshd-session[95733]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:09:14 compute-0 python3.9[95886]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:09:15 compute-0 sudo[96040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apcikogaxbxsyaajhrywcwsqhpjumhsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050954.7348216-34-209361228274562/AnsiballZ_file.py'
Nov 25 06:09:15 compute-0 sudo[96040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:15 compute-0 python3.9[96042]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:15 compute-0 sudo[96040]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:15 compute-0 sudo[96192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbeoxrouxyaljbrcgbdufqilvcyqynnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050955.3080494-34-112125132968064/AnsiballZ_file.py'
Nov 25 06:09:15 compute-0 sudo[96192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:15 compute-0 python3.9[96194]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:15 compute-0 sudo[96192]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:15 compute-0 sudo[96344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvcxijrzfmybdpahhgqtzzstszfhidtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050955.7291625-34-162472549706836/AnsiballZ_file.py'
Nov 25 06:09:15 compute-0 sudo[96344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:16 compute-0 python3.9[96346]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:16 compute-0 sudo[96344]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:16 compute-0 sudo[96496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsponqugogvicugteugjwmotxisdqgci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050956.2524705-34-198718582764539/AnsiballZ_file.py'
Nov 25 06:09:16 compute-0 sudo[96496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:16 compute-0 systemd[1]: Stopping User Manager for UID 0...
Nov 25 06:09:16 compute-0 systemd[95168]: Activating special unit Exit the Session...
Nov 25 06:09:16 compute-0 systemd[95168]: Stopped target Main User Target.
Nov 25 06:09:16 compute-0 systemd[95168]: Stopped target Basic System.
Nov 25 06:09:16 compute-0 systemd[95168]: Stopped target Paths.
Nov 25 06:09:16 compute-0 systemd[95168]: Stopped target Sockets.
Nov 25 06:09:16 compute-0 systemd[95168]: Stopped target Timers.
Nov 25 06:09:16 compute-0 systemd[95168]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 06:09:16 compute-0 systemd[95168]: Closed D-Bus User Message Bus Socket.
Nov 25 06:09:16 compute-0 systemd[95168]: Stopped Create User's Volatile Files and Directories.
Nov 25 06:09:16 compute-0 systemd[95168]: Removed slice User Application Slice.
Nov 25 06:09:16 compute-0 systemd[95168]: Reached target Shutdown.
Nov 25 06:09:16 compute-0 systemd[95168]: Finished Exit the Session.
Nov 25 06:09:16 compute-0 systemd[95168]: Reached target Exit the Session.
Nov 25 06:09:16 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Nov 25 06:09:16 compute-0 systemd[1]: Stopped User Manager for UID 0.
Nov 25 06:09:16 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 25 06:09:16 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 25 06:09:16 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 25 06:09:16 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 25 06:09:16 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Nov 25 06:09:16 compute-0 python3.9[96498]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:16 compute-0 sudo[96496]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:16 compute-0 sudo[96650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuaydtxxdzdodfvhwlxxwtzgtvdwceep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050956.672169-34-17719921876716/AnsiballZ_file.py'
Nov 25 06:09:16 compute-0 sudo[96650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:16 compute-0 python3.9[96652]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:17 compute-0 sudo[96650]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:17 compute-0 python3.9[96802]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:09:17 compute-0 sudo[96952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvawkglwfdwfegajvoercqoflechkldv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050957.6710138-78-40723118577799/AnsiballZ_seboolean.py'
Nov 25 06:09:17 compute-0 sudo[96952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:18 compute-0 python3.9[96954]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 25 06:09:18 compute-0 sudo[96952]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:19 compute-0 python3.9[97104]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:19 compute-0 python3.9[97225]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050958.7420785-86-96517958743853/.source follow=False _original_basename=haproxy.j2 checksum=bc1b563e515216c8ed64eb57c634e925f6fcbb17 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:20 compute-0 python3.9[97375]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:20 compute-0 python3.9[97496]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050959.7941418-101-101099329374795/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:20 compute-0 sudo[97646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljrdgythdrlcazvyrcqeatavaagspsvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050960.6918795-118-39286175190990/AnsiballZ_setup.py'
Nov 25 06:09:21 compute-0 sudo[97646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:21 compute-0 python3.9[97648]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:09:21 compute-0 sudo[97646]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:21 compute-0 sudo[97730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfbqrvomsjusximmpaxecbqnykzkolno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050960.6918795-118-39286175190990/AnsiballZ_dnf.py'
Nov 25 06:09:21 compute-0 sudo[97730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:21 compute-0 python3.9[97732]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:09:22 compute-0 sudo[97730]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:23 compute-0 sudo[97883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pstbwkaovabpuzpwrqoosqrgvqwziksx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050963.0171745-130-71352735659687/AnsiballZ_systemd.py'
Nov 25 06:09:23 compute-0 sudo[97883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:23 compute-0 python3.9[97885]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 06:09:23 compute-0 sudo[97883]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:24 compute-0 python3.9[98038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:24 compute-0 python3.9[98159]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050963.858205-138-168682883512596/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:24 compute-0 python3.9[98309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:25 compute-0 python3.9[98430]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050964.6244152-138-93416969785362/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:26 compute-0 python3.9[98581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:26 compute-0 python3.9[98702]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050965.9280698-182-147329851520777/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:26 compute-0 python3.9[98852]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:27 compute-0 python3.9[98973]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050966.693464-182-146416836686642/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:27 compute-0 python3.9[99123]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:09:28 compute-0 sudo[99275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-devqkviyvtparcvkyefdqfctzqasxlbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050967.9361525-220-156896307353951/AnsiballZ_file.py'
Nov 25 06:09:28 compute-0 sudo[99275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:28 compute-0 python3.9[99277]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:28 compute-0 sudo[99275]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:28 compute-0 sudo[99427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkywpqzfzmrxylwsiopsbdwisyghiwty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050968.392506-228-17704546429531/AnsiballZ_stat.py'
Nov 25 06:09:28 compute-0 sudo[99427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:28 compute-0 python3.9[99429]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:28 compute-0 sudo[99427]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:28 compute-0 sudo[99505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clfzmcbpkbmabmkpkdhrwqzylumobpwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050968.392506-228-17704546429531/AnsiballZ_file.py'
Nov 25 06:09:28 compute-0 sudo[99505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:29 compute-0 python3.9[99507]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:29 compute-0 sudo[99505]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:29 compute-0 sudo[99657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhsxmmjpuwfrjfwxxbbqlbgocttsphjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050969.1317282-228-200114579628808/AnsiballZ_stat.py'
Nov 25 06:09:29 compute-0 sudo[99657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:29 compute-0 python3.9[99659]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:29 compute-0 sudo[99657]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:29 compute-0 sudo[99735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezsylqgpjypkizyvdjpbmxekjxlclxzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050969.1317282-228-200114579628808/AnsiballZ_file.py'
Nov 25 06:09:29 compute-0 sudo[99735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:29 compute-0 python3.9[99737]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:29 compute-0 sudo[99735]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:30 compute-0 sudo[99887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-levtoipfxzmcpmwovwtmxslwzosdtdly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050969.8941998-251-277207534579389/AnsiballZ_file.py'
Nov 25 06:09:30 compute-0 sudo[99887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:30 compute-0 python3.9[99889]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:09:30 compute-0 sudo[99887]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:30 compute-0 sudo[100039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ectsvnujfskqlqjrvjydyjrcqkofxpht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050970.3451934-259-246444665388286/AnsiballZ_stat.py'
Nov 25 06:09:30 compute-0 sudo[100039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:30 compute-0 python3.9[100041]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:30 compute-0 sudo[100039]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:30 compute-0 sudo[100117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eystuwiosobewhgenqoszafmccepvnsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050970.3451934-259-246444665388286/AnsiballZ_file.py'
Nov 25 06:09:30 compute-0 sudo[100117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:30 compute-0 python3.9[100119]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:09:31 compute-0 sudo[100117]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:31 compute-0 sudo[100269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqzsqlntbggskyqiqglaqlgmdhujeaeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050971.1232224-271-172017748110594/AnsiballZ_stat.py'
Nov 25 06:09:31 compute-0 sudo[100269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:31 compute-0 python3.9[100271]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:31 compute-0 sudo[100269]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:31 compute-0 sudo[100347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrodyvsvaeendxzvskxeazddhowrzsde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050971.1232224-271-172017748110594/AnsiballZ_file.py'
Nov 25 06:09:31 compute-0 sudo[100347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:31 compute-0 python3.9[100349]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:09:31 compute-0 sudo[100347]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:32 compute-0 sudo[100499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yauojzkfozedyybhiqabnafdeyfaqvkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050971.8859482-283-11912139759/AnsiballZ_systemd.py'
Nov 25 06:09:32 compute-0 sudo[100499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:32 compute-0 python3.9[100501]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:09:32 compute-0 systemd[1]: Reloading.
Nov 25 06:09:32 compute-0 systemd-rc-local-generator[100522]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:09:32 compute-0 systemd-sysv-generator[100525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:09:32 compute-0 sudo[100499]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:32 compute-0 sudo[100688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvmlrfyuhraerofjezlffuzhfmiqrlld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050972.6669724-291-58503461935711/AnsiballZ_stat.py'
Nov 25 06:09:32 compute-0 sudo[100688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:32 compute-0 python3.9[100690]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:33 compute-0 sudo[100688]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:33 compute-0 sudo[100766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvzgtxxdiwhdfckazdurolcjmvhrmsul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050972.6669724-291-58503461935711/AnsiballZ_file.py'
Nov 25 06:09:33 compute-0 sudo[100766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:33 compute-0 python3.9[100768]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:09:33 compute-0 sudo[100766]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:33 compute-0 sudo[100918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdulqpswgavulkoefqfshsfvbstjescq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050973.4556446-303-52733624307506/AnsiballZ_stat.py'
Nov 25 06:09:33 compute-0 sudo[100918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:33 compute-0 python3.9[100920]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:33 compute-0 sudo[100918]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:33 compute-0 sudo[100996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiffbyxiqhuuyjqomidnulemabtefpac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050973.4556446-303-52733624307506/AnsiballZ_file.py'
Nov 25 06:09:33 compute-0 sudo[100996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:34 compute-0 python3.9[100998]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:09:34 compute-0 sudo[100996]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:34 compute-0 sudo[101148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wscxephsvetzizthjttanuijkzsxzmzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050974.2474434-315-230445676042238/AnsiballZ_systemd.py'
Nov 25 06:09:34 compute-0 sudo[101148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:34 compute-0 python3.9[101150]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:09:34 compute-0 systemd[1]: Reloading.
Nov 25 06:09:34 compute-0 systemd-rc-local-generator[101171]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:09:34 compute-0 systemd-sysv-generator[101174]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:09:34 compute-0 systemd[1]: Starting Create netns directory...
Nov 25 06:09:34 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 25 06:09:34 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 25 06:09:34 compute-0 systemd[1]: Finished Create netns directory.
Nov 25 06:09:34 compute-0 sudo[101148]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:35 compute-0 sudo[101341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyfenlklxdyqsqvypjzyaiussyytqnvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050975.0978026-325-44779910929296/AnsiballZ_file.py'
Nov 25 06:09:35 compute-0 sudo[101341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:35 compute-0 python3.9[101343]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:35 compute-0 sudo[101341]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:35 compute-0 sudo[101493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yusucmfgnsnlrtsdcyugmhrortqwqvgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050975.560227-333-223723093883994/AnsiballZ_stat.py'
Nov 25 06:09:35 compute-0 sudo[101493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:35 compute-0 python3.9[101495]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:35 compute-0 sudo[101493]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:36 compute-0 sudo[101616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mefdejfbkhinyequabewngwiktfaxdtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050975.560227-333-223723093883994/AnsiballZ_copy.py'
Nov 25 06:09:36 compute-0 sudo[101616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:36 compute-0 python3.9[101618]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764050975.560227-333-223723093883994/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:36 compute-0 sudo[101616]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:36 compute-0 ovn_controller[95135]: 2025-11-25T06:09:36Z|00037|memory|INFO|16512 kB peak resident set size after 29.9 seconds
Nov 25 06:09:36 compute-0 ovn_controller[95135]: 2025-11-25T06:09:36Z|00038|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Nov 25 06:09:36 compute-0 podman[101619]: 2025-11-25 06:09:36.349903517 +0000 UTC m=+0.061204736 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 06:09:36 compute-0 sudo[101791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cddjsmkcrbpcrpkimglmttmgdfqznkng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050976.5430424-350-198364934096719/AnsiballZ_file.py'
Nov 25 06:09:36 compute-0 sudo[101791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:36 compute-0 python3.9[101793]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:09:36 compute-0 sudo[101791]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:37 compute-0 sudo[101943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ongvjpdevbqgunbpadncbownqwyuykus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050977.0294197-358-56560141049826/AnsiballZ_stat.py'
Nov 25 06:09:37 compute-0 sudo[101943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:37 compute-0 python3.9[101945]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:09:37 compute-0 sudo[101943]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:37 compute-0 sudo[102066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hprkcmeoqassoyizirimokoaipkgdnub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050977.0294197-358-56560141049826/AnsiballZ_copy.py'
Nov 25 06:09:37 compute-0 sudo[102066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:37 compute-0 python3.9[102068]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764050977.0294197-358-56560141049826/.source.json _original_basename=.lbmkj245 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:09:37 compute-0 sudo[102066]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:38 compute-0 sudo[102218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjlwdjzindpdunyhftlhwkciopymgzif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050977.8690176-373-93147364515905/AnsiballZ_file.py'
Nov 25 06:09:38 compute-0 sudo[102218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:38 compute-0 python3.9[102220]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:09:38 compute-0 sudo[102218]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:38 compute-0 sudo[102370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfrtubbzmkpjugdvfnpcovatovknqiec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050978.3598258-381-7219866436837/AnsiballZ_stat.py'
Nov 25 06:09:38 compute-0 sudo[102370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:38 compute-0 sudo[102370]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:38 compute-0 sudo[102493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmamohnlpgnuygqvsqnsyzmkrnvnupqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050978.3598258-381-7219866436837/AnsiballZ_copy.py'
Nov 25 06:09:38 compute-0 sudo[102493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:39 compute-0 sudo[102493]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:39 compute-0 sudo[102645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqhwqjnrkoefcmtfznicfnklqzrnivmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050979.298698-398-223008433771093/AnsiballZ_container_config_data.py'
Nov 25 06:09:39 compute-0 sudo[102645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:39 compute-0 python3.9[102647]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 25 06:09:39 compute-0 sudo[102645]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:40 compute-0 sudo[102797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znxakdmkoioqbglzovkztxugbhgazamt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050979.955434-407-177611779278979/AnsiballZ_container_config_hash.py'
Nov 25 06:09:40 compute-0 sudo[102797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:40 compute-0 python3.9[102799]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 06:09:40 compute-0 sudo[102797]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:40 compute-0 sudo[102949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osrzawlyqujmgqdmuzfveovvntgnmlwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050980.5910907-416-84297446085102/AnsiballZ_podman_container_info.py'
Nov 25 06:09:40 compute-0 sudo[102949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:41 compute-0 python3.9[102951]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 25 06:09:41 compute-0 sudo[102949]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:41 compute-0 sudo[103120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qglzgepjxdzsbkppwrfzaldbjnbtbrtl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764050981.5007603-429-100664930923477/AnsiballZ_edpm_container_manage.py'
Nov 25 06:09:41 compute-0 sudo[103120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:42 compute-0 python3[103122]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 06:09:42 compute-0 podman[103150]: 2025-11-25 06:09:42.163251267 +0000 UTC m=+0.031030299 container create 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Nov 25 06:09:42 compute-0 podman[103150]: 2025-11-25 06:09:42.148226874 +0000 UTC m=+0.016005916 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:09:42 compute-0 python3[103122]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:09:42 compute-0 sudo[103120]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:42 compute-0 sudo[103326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysvkmhnhwtqizmqdtvoihgdrnslkykbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050982.3735557-437-66861842070036/AnsiballZ_stat.py'
Nov 25 06:09:42 compute-0 sudo[103326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:42 compute-0 python3.9[103328]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:09:42 compute-0 sudo[103326]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:43 compute-0 sudo[103480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-herkfstmrkunwvlqlgoslvklptuextzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050982.9110196-446-200290564691490/AnsiballZ_file.py'
Nov 25 06:09:43 compute-0 sudo[103480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:43 compute-0 python3.9[103482]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:09:43 compute-0 sudo[103480]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:43 compute-0 sudo[103556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xplzypxoqpmsralfdoevlhlpbycmeoqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050982.9110196-446-200290564691490/AnsiballZ_stat.py'
Nov 25 06:09:43 compute-0 sudo[103556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:43 compute-0 python3.9[103558]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:09:43 compute-0 sudo[103556]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:43 compute-0 sudo[103707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbaobgkbwsylazdeugcymjdpvglytzxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050983.582849-446-169953144485760/AnsiballZ_copy.py'
Nov 25 06:09:43 compute-0 sudo[103707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:44 compute-0 python3.9[103709]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764050983.582849-446-169953144485760/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:09:44 compute-0 sudo[103707]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:44 compute-0 sudo[103783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwvewxphkqndajgyfnsznszrqrvqsscg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050983.582849-446-169953144485760/AnsiballZ_systemd.py'
Nov 25 06:09:44 compute-0 sudo[103783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:44 compute-0 python3.9[103785]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:09:44 compute-0 systemd[1]: Reloading.
Nov 25 06:09:44 compute-0 systemd-sysv-generator[103812]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:09:44 compute-0 systemd-rc-local-generator[103808]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:09:44 compute-0 sudo[103783]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:44 compute-0 sudo[103894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfutyucopgdafdarqxgrhxghfktgvlpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050983.582849-446-169953144485760/AnsiballZ_systemd.py'
Nov 25 06:09:44 compute-0 sudo[103894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:45 compute-0 python3.9[103896]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:09:45 compute-0 systemd[1]: Reloading.
Nov 25 06:09:45 compute-0 systemd-rc-local-generator[103919]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:09:45 compute-0 systemd-sysv-generator[103922]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:09:45 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Nov 25 06:09:45 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:09:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43564ec3296cd8814dce0a623e2904aaf15099e06671ce0715310c1864cabd41/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 25 06:09:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43564ec3296cd8814dce0a623e2904aaf15099e06671ce0715310c1864cabd41/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:09:45 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62.
Nov 25 06:09:45 compute-0 podman[103936]: 2025-11-25 06:09:45.322261699 +0000 UTC m=+0.075554998 container init 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: + sudo -E kolla_set_configs
Nov 25 06:09:45 compute-0 podman[103936]: 2025-11-25 06:09:45.342423278 +0000 UTC m=+0.095716557 container start 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 25 06:09:45 compute-0 edpm-start-podman-container[103936]: ovn_metadata_agent
Nov 25 06:09:45 compute-0 edpm-start-podman-container[103935]: Creating additional drop-in dependency for "ovn_metadata_agent" (62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62)
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Validating config file
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Copying service configuration files
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Writing out command to execute
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: ++ cat /run_command
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: + CMD=neutron-ovn-metadata-agent
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: + ARGS=
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: + sudo kolla_copy_cacerts
Nov 25 06:09:45 compute-0 systemd[1]: Reloading.
Nov 25 06:09:45 compute-0 podman[103955]: 2025-11-25 06:09:45.394878698 +0000 UTC m=+0.042192733 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: + [[ ! -n '' ]]
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: + . kolla_extend_start
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: Running command: 'neutron-ovn-metadata-agent'
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: + umask 0022
Nov 25 06:09:45 compute-0 ovn_metadata_agent[103948]: + exec neutron-ovn-metadata-agent
Nov 25 06:09:45 compute-0 systemd-rc-local-generator[104013]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:09:45 compute-0 systemd-sysv-generator[104016]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:09:45 compute-0 systemd[1]: Started ovn_metadata_agent container.
Nov 25 06:09:45 compute-0 sudo[103894]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:45 compute-0 sshd-session[95736]: Connection closed by 192.168.122.30 port 57140
Nov 25 06:09:45 compute-0 sshd-session[95733]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:09:45 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Nov 25 06:09:45 compute-0 systemd[1]: session-21.scope: Consumed 23.526s CPU time.
Nov 25 06:09:45 compute-0 systemd-logind[744]: Session 21 logged out. Waiting for processes to exit.
Nov 25 06:09:45 compute-0 systemd-logind[744]: Removed session 21.
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.776 103953 INFO neutron.common.config [-] Logging enabled!
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.776 103953 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev143
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.776 103953 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:124
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.776 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2804
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2805
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2806
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2807
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2809
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.777 103953 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.778 103953 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.779 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.780 103953 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 192.168.26.115 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.781 103953 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.782 103953 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.783 103953 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.784 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.785 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.786 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.787 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.788 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.789 103953 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.790 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.791 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.792 103953 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.793 103953 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.794 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.795 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.796 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.797 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.798 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.799 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.800 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.801 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.802 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.802 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.802 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.802 103953 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.802 103953 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2828
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.809 103953 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.810 103953 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.810 103953 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.810 103953 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.810 103953 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.819 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name afd6e104-36fa-47e5-ae59-019941e8d117 (UUID: afd6e104-36fa-47e5-ae59-019941e8d117) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:419
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.840 103953 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.841 103953 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.841 103953 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.841 103953 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.841 103953 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.843 103953 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.846 103953 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.850 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'afd6e104-36fa-47e5-ae59-019941e8d117'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], external_ids={}, name=afd6e104-36fa-47e5-ae59-019941e8d117, nb_cfg_timestamp=1764050955478, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:09:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:46.851 103953 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpnc9b353r/privsep.sock']
Nov 25 06:09:47 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 25 06:09:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:47.348 103953 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 25 06:09:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:47.349 103953 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpnc9b353r/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:366
Nov 25 06:09:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:47.278 104066 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 25 06:09:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:47.281 104066 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 25 06:09:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:47.283 104066 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 25 06:09:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:47.283 104066 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104066
Nov 25 06:09:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:47.350 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[1eb5509d-bf32-4b53-8a20-1ca8b2c1b806]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:09:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:47.740 104066 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:09:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:47.740 104066 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:09:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:47.740 104066 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:09:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:48.078 104066 INFO oslo_service.backend [-] Loading backend: eventlet
Nov 25 06:09:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:48.082 104066 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Nov 25 06:09:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:48.112 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[9c41d587-b9df-4747-9f1e-c158e1d6e0eb]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:09:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:48.114 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, column=external_ids, values=({'neutron:ovn-metadata-id': 'a863ead2-7163-5364-8693-8940bd3caadf'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:09:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:09:48.119 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:09:50 compute-0 sshd-session[104073]: Accepted publickey for zuul from 192.168.122.30 port 54614 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:09:50 compute-0 systemd-logind[744]: New session 22 of user zuul.
Nov 25 06:09:50 compute-0 systemd[1]: Started Session 22 of User zuul.
Nov 25 06:09:50 compute-0 sshd-session[104073]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:09:51 compute-0 python3.9[104226]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:09:52 compute-0 sudo[104380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajggvbanoyaalwzjnwbbfpffuyqebmry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050991.910304-34-109014900569036/AnsiballZ_command.py'
Nov 25 06:09:52 compute-0 sudo[104380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:52 compute-0 python3.9[104382]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:09:52 compute-0 sudo[104380]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:53 compute-0 sudo[104541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvnfysruwwhnmboigcjsgwbgkxxsuowq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050992.622422-45-212913453216338/AnsiballZ_systemd_service.py'
Nov 25 06:09:53 compute-0 sudo[104541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:53 compute-0 python3.9[104543]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:09:53 compute-0 systemd[1]: Reloading.
Nov 25 06:09:53 compute-0 systemd-rc-local-generator[104569]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:09:53 compute-0 systemd-sysv-generator[104572]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:09:53 compute-0 sudo[104541]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:54 compute-0 python3.9[104728]: ansible-ansible.builtin.service_facts Invoked
Nov 25 06:09:54 compute-0 network[104745]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 06:09:54 compute-0 network[104746]: 'network-scripts' will be removed from distribution in near future.
Nov 25 06:09:54 compute-0 network[104747]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 06:09:56 compute-0 sudo[105006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbwzkzpwffnajetabrnhgpjlzydygvuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050995.8976996-64-126390129944634/AnsiballZ_systemd_service.py'
Nov 25 06:09:56 compute-0 sudo[105006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:56 compute-0 python3.9[105008]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:09:56 compute-0 sudo[105006]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:56 compute-0 sudo[105159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aubfmwovhikyuldiyfbjwzztheasxeom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050996.455194-64-100593335472022/AnsiballZ_systemd_service.py'
Nov 25 06:09:56 compute-0 sudo[105159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:56 compute-0 python3.9[105161]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:09:56 compute-0 sudo[105159]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:57 compute-0 sudo[105312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxhouliqgkyhecbmfiqystvvwglfubha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050997.0120957-64-273072764773578/AnsiballZ_systemd_service.py'
Nov 25 06:09:57 compute-0 sudo[105312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:57 compute-0 python3.9[105314]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:09:57 compute-0 sudo[105312]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:57 compute-0 sudo[105465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjmryhmmvowqqoifywdxhuftvusekzqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050997.5507371-64-192169148004169/AnsiballZ_systemd_service.py'
Nov 25 06:09:57 compute-0 sudo[105465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:57 compute-0 python3.9[105467]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:09:58 compute-0 sudo[105465]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:58 compute-0 sudo[105618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldisdhodwkzjfrnohoqzzbnznjfbklgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050998.1022592-64-229767340804283/AnsiballZ_systemd_service.py'
Nov 25 06:09:58 compute-0 sudo[105618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:58 compute-0 python3.9[105620]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:09:58 compute-0 sudo[105618]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:58 compute-0 sudo[105771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lphsotegvxnkndmcpnqbepxfkvmzkuic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050998.6478229-64-89110829280601/AnsiballZ_systemd_service.py'
Nov 25 06:09:58 compute-0 sudo[105771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:59 compute-0 python3.9[105773]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:09:59 compute-0 sudo[105771]: pam_unix(sudo:session): session closed for user root
Nov 25 06:09:59 compute-0 sudo[105924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utrgmhptrnassgnoshqiownmntdiormt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050999.2018428-64-73137432231757/AnsiballZ_systemd_service.py'
Nov 25 06:09:59 compute-0 sudo[105924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:09:59 compute-0 python3.9[105926]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:09:59 compute-0 sudo[105924]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:00 compute-0 sudo[106077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mexscyhgfztdlftjfzobltxzbaygbkrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764050999.907417-116-91956543435656/AnsiballZ_file.py'
Nov 25 06:10:00 compute-0 sudo[106077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:00 compute-0 python3.9[106079]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:00 compute-0 sudo[106077]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:00 compute-0 sudo[106229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkuexqbwcepdyesobwmmpkhtqqfmvgyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051000.4590144-116-89076586593391/AnsiballZ_file.py'
Nov 25 06:10:00 compute-0 sudo[106229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:00 compute-0 python3.9[106231]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:00 compute-0 sudo[106229]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:01 compute-0 sudo[106381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blgiwqblnykguzwgiqphacyngtasyunh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051000.8731804-116-10709552132752/AnsiballZ_file.py'
Nov 25 06:10:01 compute-0 sudo[106381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:01 compute-0 python3.9[106383]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:01 compute-0 sudo[106381]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:01 compute-0 sudo[106533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsqarjwxqhsaypnjdcqumpjmatyqhifw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051001.2928202-116-174026240499274/AnsiballZ_file.py'
Nov 25 06:10:01 compute-0 sudo[106533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:01 compute-0 python3.9[106535]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:01 compute-0 sudo[106533]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:01 compute-0 sudo[106685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyxyxepgtzhthdbgzuluqqyyudvnpymi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051001.710568-116-122518575411751/AnsiballZ_file.py'
Nov 25 06:10:01 compute-0 sudo[106685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:02 compute-0 python3.9[106687]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:02 compute-0 sudo[106685]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:02 compute-0 sudo[106837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrnlcwdydcsdofbqesnlgtpnyuuibrum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051002.1400566-116-266256957422468/AnsiballZ_file.py'
Nov 25 06:10:02 compute-0 sudo[106837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:02 compute-0 python3.9[106839]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:02 compute-0 sudo[106837]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:02 compute-0 sudo[106989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkrbbfffdmtibvznuqptxnjwxfeugswh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051002.5623522-116-29437258432623/AnsiballZ_file.py'
Nov 25 06:10:02 compute-0 sudo[106989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:02 compute-0 python3.9[106991]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:02 compute-0 sudo[106989]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:03 compute-0 sudo[107141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ompjfnbugkyxvqapertyehehuixxgido ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051003.0122411-166-235819353387766/AnsiballZ_file.py'
Nov 25 06:10:03 compute-0 sudo[107141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:03 compute-0 python3.9[107143]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:03 compute-0 sudo[107141]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:03 compute-0 sudo[107293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhgwulzzsdjfbvyfcmvhnwmnbdikopnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051003.4454694-166-180420650812693/AnsiballZ_file.py'
Nov 25 06:10:03 compute-0 sudo[107293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:03 compute-0 python3.9[107295]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:03 compute-0 sudo[107293]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:04 compute-0 sudo[107445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnxwwdosulvhgxelmkzrmkctctdkzcin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051003.8787303-166-34703971380815/AnsiballZ_file.py'
Nov 25 06:10:04 compute-0 sudo[107445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:04 compute-0 python3.9[107447]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:04 compute-0 sudo[107445]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:04 compute-0 sudo[107597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czrpeihldciqxrtluloyljgtrgswdhyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051004.303451-166-36915277426129/AnsiballZ_file.py'
Nov 25 06:10:04 compute-0 sudo[107597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:04 compute-0 python3.9[107599]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:04 compute-0 sudo[107597]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:04 compute-0 sudo[107749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsfojopefkvsfmmqlxowkdtgtvsuuyhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051004.7247663-166-100203167865016/AnsiballZ_file.py'
Nov 25 06:10:04 compute-0 sudo[107749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:05 compute-0 python3.9[107751]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:05 compute-0 sudo[107749]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:05 compute-0 sudo[107901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmjpowkczviulqagusulrglsxfgrcpma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051005.1437175-166-160391364828974/AnsiballZ_file.py'
Nov 25 06:10:05 compute-0 sudo[107901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:05 compute-0 python3.9[107903]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:05 compute-0 sudo[107901]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:05 compute-0 sudo[108053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tptybukaqglfvrddgufrngjyuequjggc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051005.5630746-166-40622731538930/AnsiballZ_file.py'
Nov 25 06:10:05 compute-0 sudo[108053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:05 compute-0 python3.9[108055]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:10:05 compute-0 sudo[108053]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:06 compute-0 sudo[108205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thdizpmylbuecpplfbxbxabemcdsovgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051006.0668914-217-32375875148971/AnsiballZ_command.py'
Nov 25 06:10:06 compute-0 sudo[108205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:06 compute-0 python3.9[108207]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:10:06 compute-0 sudo[108205]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:06 compute-0 podman[108210]: 2025-11-25 06:10:06.483000705 +0000 UTC m=+0.064471082 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 25 06:10:06 compute-0 python3.9[108383]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 06:10:07 compute-0 sudo[108533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-delcxiridebknsvmdxnlqdbiblmrmvii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051007.1885931-235-26837079050776/AnsiballZ_systemd_service.py'
Nov 25 06:10:07 compute-0 sudo[108533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:07 compute-0 python3.9[108535]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:10:07 compute-0 systemd[1]: Reloading.
Nov 25 06:10:07 compute-0 systemd-rc-local-generator[108558]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:10:07 compute-0 systemd-sysv-generator[108561]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:10:07 compute-0 sudo[108533]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:08 compute-0 sudo[108720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpsbigizufhxqlsczcxqmgnpxwwzusfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051007.9162688-243-113536358785355/AnsiballZ_command.py'
Nov 25 06:10:08 compute-0 sudo[108720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:08 compute-0 python3.9[108722]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:10:08 compute-0 sudo[108720]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:08 compute-0 sudo[108873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nclffawqqpbjmlqvvexpxehcjiqskzye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051008.3543115-243-95040802896593/AnsiballZ_command.py'
Nov 25 06:10:08 compute-0 sudo[108873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:08 compute-0 python3.9[108875]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:10:08 compute-0 sudo[108873]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:08 compute-0 sudo[109026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hborxxnvfvnjdsfsbmlcerkoziisedrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051008.7830417-243-140939086636542/AnsiballZ_command.py'
Nov 25 06:10:08 compute-0 sudo[109026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:09 compute-0 python3.9[109028]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:10:09 compute-0 sudo[109026]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:09 compute-0 sudo[109179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fytyjozsvzmgohyinajpzjsadzrnnflv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051009.2151513-243-140385640054084/AnsiballZ_command.py'
Nov 25 06:10:09 compute-0 sudo[109179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:09 compute-0 python3.9[109181]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:10:09 compute-0 sudo[109179]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:09 compute-0 sudo[109332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrgizdonvpxhdejnebwgogxjnmmbejgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051009.664385-243-81274771507367/AnsiballZ_command.py'
Nov 25 06:10:09 compute-0 sudo[109332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:09 compute-0 python3.9[109334]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:10:10 compute-0 sudo[109332]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:10 compute-0 sudo[109485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtfsrnpcapzslyeetaixmqrgsvlxlnsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051010.1018891-243-232716486730348/AnsiballZ_command.py'
Nov 25 06:10:10 compute-0 sudo[109485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:10 compute-0 python3.9[109487]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:10:10 compute-0 sudo[109485]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:10 compute-0 sudo[109638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewtiqphkeursufmjqvljyaphjyemyoey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051010.5338109-243-146584140379266/AnsiballZ_command.py'
Nov 25 06:10:10 compute-0 sudo[109638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:10 compute-0 python3.9[109640]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:10:10 compute-0 sudo[109638]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:11 compute-0 sudo[109791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inbxdjzfefkxdexjdqbtxnukljtidbjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051011.150155-297-228497940767478/AnsiballZ_getent.py'
Nov 25 06:10:11 compute-0 sudo[109791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:11 compute-0 python3.9[109793]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 25 06:10:11 compute-0 sudo[109791]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:12 compute-0 sudo[109944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iawgetsmzdkkukffejayjhbdbgjvpadx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051011.8065667-305-175101177572088/AnsiballZ_group.py'
Nov 25 06:10:12 compute-0 sudo[109944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:12 compute-0 python3.9[109946]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 06:10:12 compute-0 groupadd[109947]: group added to /etc/group: name=libvirt, GID=42473
Nov 25 06:10:12 compute-0 groupadd[109947]: group added to /etc/gshadow: name=libvirt
Nov 25 06:10:12 compute-0 groupadd[109947]: new group: name=libvirt, GID=42473
Nov 25 06:10:12 compute-0 sudo[109944]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:12 compute-0 sudo[110102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqqzonkxuajwabiuhjsjrpkrcjayfemn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051012.4989824-313-191726712732069/AnsiballZ_user.py'
Nov 25 06:10:12 compute-0 sudo[110102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:13 compute-0 python3.9[110104]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 06:10:13 compute-0 useradd[110106]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Nov 25 06:10:13 compute-0 sudo[110102]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:13 compute-0 sudo[110262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfrrkcppkjorgvclvlnzulhxlafyvyto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051013.3585923-324-152148956772616/AnsiballZ_setup.py'
Nov 25 06:10:13 compute-0 sudo[110262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:13 compute-0 python3.9[110264]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:10:14 compute-0 sudo[110262]: pam_unix(sudo:session): session closed for user root
Nov 25 06:10:14 compute-0 sudo[110346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxyiurpeqrtmcmtbwgcbdbmczudhuzyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051013.3585923-324-152148956772616/AnsiballZ_dnf.py'
Nov 25 06:10:14 compute-0 sudo[110346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:10:14 compute-0 python3.9[110348]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:10:16 compute-0 podman[110353]: 2025-11-25 06:10:16.067877073 +0000 UTC m=+0.041516078 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:10:37 compute-0 podman[110377]: 2025-11-25 06:10:37.078932521 +0000 UTC m=+0.057902150 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 25 06:10:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:10:46.838 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:10:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:10:46.839 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:10:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:10:46.839 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:10:47 compute-0 podman[110401]: 2025-11-25 06:10:47.058293046 +0000 UTC m=+0.038300636 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:11:08 compute-0 podman[110590]: 2025-11-25 06:11:08.099709344 +0000 UTC m=+0.061809905 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 06:11:16 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Nov 25 06:11:16 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 06:11:16 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 25 06:11:16 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 06:11:16 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 25 06:11:16 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 06:11:16 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 06:11:16 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 06:11:18 compute-0 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 25 06:11:18 compute-0 podman[110629]: 2025-11-25 06:11:18.084020841 +0000 UTC m=+0.045544842 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 06:11:23 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Nov 25 06:11:23 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 06:11:23 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 25 06:11:23 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 06:11:23 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 25 06:11:23 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 06:11:23 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 06:11:23 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 06:11:39 compute-0 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 25 06:11:39 compute-0 podman[116786]: 2025-11-25 06:11:39.096065095 +0000 UTC m=+0.067251535 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 25 06:11:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:11:46.899 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:11:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:11:46.899 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:11:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:11:46.899 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:11:49 compute-0 podman[126637]: 2025-11-25 06:11:49.081826658 +0000 UTC m=+0.060037124 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 06:11:59 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Nov 25 06:11:59 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 06:11:59 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 25 06:11:59 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 06:11:59 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 25 06:11:59 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 06:11:59 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 06:11:59 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 06:11:59 compute-0 groupadd[127504]: group added to /etc/group: name=dnsmasq, GID=992
Nov 25 06:11:59 compute-0 groupadd[127504]: group added to /etc/gshadow: name=dnsmasq
Nov 25 06:11:59 compute-0 groupadd[127504]: new group: name=dnsmasq, GID=992
Nov 25 06:11:59 compute-0 useradd[127511]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 25 06:11:59 compute-0 dbus-broker-launch[713]: Noticed file-system modification, trigger reload.
Nov 25 06:11:59 compute-0 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 25 06:11:59 compute-0 dbus-broker-launch[713]: Noticed file-system modification, trigger reload.
Nov 25 06:12:00 compute-0 groupadd[127524]: group added to /etc/group: name=clevis, GID=991
Nov 25 06:12:00 compute-0 groupadd[127524]: group added to /etc/gshadow: name=clevis
Nov 25 06:12:00 compute-0 groupadd[127524]: new group: name=clevis, GID=991
Nov 25 06:12:00 compute-0 useradd[127531]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 25 06:12:00 compute-0 usermod[127541]: add 'clevis' to group 'tss'
Nov 25 06:12:00 compute-0 usermod[127541]: add 'clevis' to shadow group 'tss'
Nov 25 06:12:02 compute-0 polkitd[43570]: Reloading rules
Nov 25 06:12:02 compute-0 polkitd[43570]: Collecting garbage unconditionally...
Nov 25 06:12:02 compute-0 polkitd[43570]: Loading rules from directory /etc/polkit-1/rules.d
Nov 25 06:12:02 compute-0 polkitd[43570]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 25 06:12:02 compute-0 polkitd[43570]: Finished loading, compiling and executing 3 rules
Nov 25 06:12:02 compute-0 polkitd[43570]: Reloading rules
Nov 25 06:12:02 compute-0 polkitd[43570]: Collecting garbage unconditionally...
Nov 25 06:12:02 compute-0 polkitd[43570]: Loading rules from directory /etc/polkit-1/rules.d
Nov 25 06:12:02 compute-0 polkitd[43570]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 25 06:12:02 compute-0 polkitd[43570]: Finished loading, compiling and executing 3 rules
Nov 25 06:12:02 compute-0 groupadd[127728]: group added to /etc/group: name=ceph, GID=167
Nov 25 06:12:02 compute-0 groupadd[127728]: group added to /etc/gshadow: name=ceph
Nov 25 06:12:02 compute-0 groupadd[127728]: new group: name=ceph, GID=167
Nov 25 06:12:02 compute-0 useradd[127734]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 25 06:12:04 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Nov 25 06:12:04 compute-0 sshd[962]: Received signal 15; terminating.
Nov 25 06:12:04 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Nov 25 06:12:04 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Nov 25 06:12:04 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Nov 25 06:12:04 compute-0 systemd[1]: Stopping sshd-keygen.target...
Nov 25 06:12:04 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 06:12:04 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 06:12:04 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 06:12:04 compute-0 systemd[1]: Reached target sshd-keygen.target.
Nov 25 06:12:04 compute-0 systemd[1]: Starting OpenSSH server daemon...
Nov 25 06:12:04 compute-0 sshd[128253]: Server listening on 0.0.0.0 port 22.
Nov 25 06:12:04 compute-0 sshd[128253]: Server listening on :: port 22.
Nov 25 06:12:04 compute-0 systemd[1]: Started OpenSSH server daemon.
Nov 25 06:12:06 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 06:12:06 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 06:12:06 compute-0 systemd[1]: Reloading.
Nov 25 06:12:06 compute-0 systemd-rc-local-generator[128504]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:12:06 compute-0 systemd-sysv-generator[128509]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:12:06 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 06:12:08 compute-0 sudo[110346]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:08 compute-0 sudo[132755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvowdbxmjmvukhrubmosrcczwmyhnebj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051128.2018814-336-170351672211809/AnsiballZ_systemd.py'
Nov 25 06:12:08 compute-0 sudo[132755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:08 compute-0 python3.9[132781]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 06:12:08 compute-0 systemd[1]: Reloading.
Nov 25 06:12:09 compute-0 systemd-sysv-generator[133276]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:12:09 compute-0 systemd-rc-local-generator[133270]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:12:09 compute-0 sudo[132755]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:09 compute-0 podman[133478]: 2025-11-25 06:12:09.272664369 +0000 UTC m=+0.071700583 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:12:09 compute-0 sudo[134049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynfdasrimmwwwgyowohkodakdgbheeyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051129.3137865-336-13627618323612/AnsiballZ_systemd.py'
Nov 25 06:12:09 compute-0 sudo[134049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:09 compute-0 python3.9[134072]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 06:12:09 compute-0 systemd[1]: Reloading.
Nov 25 06:12:09 compute-0 systemd-rc-local-generator[134577]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:12:09 compute-0 systemd-sysv-generator[134580]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:12:10 compute-0 sudo[134049]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:10 compute-0 sudo[135275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ympgldhskyczozilscxduhrzrcfsxtrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051130.1097884-336-232521133599239/AnsiballZ_systemd.py'
Nov 25 06:12:10 compute-0 sudo[135275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:10 compute-0 python3.9[135297]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 06:12:10 compute-0 systemd[1]: Reloading.
Nov 25 06:12:10 compute-0 systemd-sysv-generator[135802]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:12:10 compute-0 systemd-rc-local-generator[135796]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:12:10 compute-0 sudo[135275]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:11 compute-0 sudo[136603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbmchfcxjmdacqmvskwtlzofbrfsicek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051130.8921788-336-87975178736826/AnsiballZ_systemd.py'
Nov 25 06:12:11 compute-0 sudo[136603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:11 compute-0 python3.9[136626]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 06:12:11 compute-0 systemd[1]: Reloading.
Nov 25 06:12:11 compute-0 systemd-rc-local-generator[137109]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:12:11 compute-0 systemd-sysv-generator[137114]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:12:11 compute-0 sudo[136603]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:11 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 06:12:11 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 06:12:11 compute-0 systemd[1]: man-db-cache-update.service: Consumed 7.146s CPU time.
Nov 25 06:12:11 compute-0 systemd[1]: run-r653d3c736e634d858dd86a94247ba47b.service: Deactivated successfully.
Nov 25 06:12:11 compute-0 sudo[137832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jifujnkzyvkfkaiyvmmqdiocobxdjhci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051131.72479-365-48775595036504/AnsiballZ_systemd.py'
Nov 25 06:12:11 compute-0 sudo[137832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:12 compute-0 python3.9[137834]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:12 compute-0 systemd[1]: Reloading.
Nov 25 06:12:12 compute-0 systemd-sysv-generator[137861]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:12:12 compute-0 systemd-rc-local-generator[137858]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:12:12 compute-0 sudo[137832]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:12 compute-0 sudo[138021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgtutpmdebveeucdpcsottyvpjbrhnps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051132.490599-365-235846679119075/AnsiballZ_systemd.py'
Nov 25 06:12:12 compute-0 sudo[138021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:12 compute-0 python3.9[138023]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:12 compute-0 systemd[1]: Reloading.
Nov 25 06:12:13 compute-0 systemd-sysv-generator[138052]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:12:13 compute-0 systemd-rc-local-generator[138049]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:12:13 compute-0 sudo[138021]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:13 compute-0 sudo[138211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaqwcoujoaxhrhzcstpuobscnbmwaxaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051133.2526567-365-183559653650131/AnsiballZ_systemd.py'
Nov 25 06:12:13 compute-0 sudo[138211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:13 compute-0 python3.9[138213]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:13 compute-0 systemd[1]: Reloading.
Nov 25 06:12:13 compute-0 systemd-rc-local-generator[138237]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:12:13 compute-0 systemd-sysv-generator[138241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:12:13 compute-0 sudo[138211]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:14 compute-0 sudo[138400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkdvgytolkedwyavibeedysuuuakcslg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051134.0135567-365-248184014167821/AnsiballZ_systemd.py'
Nov 25 06:12:14 compute-0 sudo[138400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:14 compute-0 python3.9[138402]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:14 compute-0 sudo[138400]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:14 compute-0 sudo[138555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzererbnncvyzcemfqfylbydqelrwtga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051134.5843801-365-17885383113443/AnsiballZ_systemd.py'
Nov 25 06:12:14 compute-0 sudo[138555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:14 compute-0 python3.9[138557]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:15 compute-0 systemd[1]: Reloading.
Nov 25 06:12:15 compute-0 systemd-rc-local-generator[138581]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:12:15 compute-0 systemd-sysv-generator[138584]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:12:15 compute-0 sudo[138555]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:15 compute-0 sudo[138745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmammbcnedjdcifawpfrxfhcmvackppe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051135.373447-401-182176506959057/AnsiballZ_systemd.py'
Nov 25 06:12:15 compute-0 sudo[138745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:15 compute-0 python3.9[138747]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 06:12:15 compute-0 systemd[1]: Reloading.
Nov 25 06:12:15 compute-0 systemd-sysv-generator[138776]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:12:15 compute-0 systemd-rc-local-generator[138773]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:12:16 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 25 06:12:16 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 25 06:12:16 compute-0 sudo[138745]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:16 compute-0 sudo[138938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eudwizhjjgksyfnhnjzrepzrobkljtuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051136.2029564-409-34123139052637/AnsiballZ_systemd.py'
Nov 25 06:12:16 compute-0 sudo[138938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:16 compute-0 python3.9[138940]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:16 compute-0 sudo[138938]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:16 compute-0 sudo[139093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxgjqilruhaishpqpkwzandoiejxzjlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051136.7763479-409-69416886562130/AnsiballZ_systemd.py'
Nov 25 06:12:16 compute-0 sudo[139093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:17 compute-0 python3.9[139095]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:17 compute-0 sudo[139093]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:17 compute-0 sudo[139248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iefxuslrnnrhcirkjqlkuuawhtqrszqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051137.3379498-409-205279301759143/AnsiballZ_systemd.py'
Nov 25 06:12:17 compute-0 sudo[139248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:17 compute-0 python3.9[139250]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:17 compute-0 sudo[139248]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:18 compute-0 sudo[139403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzskchvfjjsrqugapgijebjtttrsdwow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051137.9028263-409-123775334435071/AnsiballZ_systemd.py'
Nov 25 06:12:18 compute-0 sudo[139403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:18 compute-0 python3.9[139405]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:18 compute-0 sudo[139403]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:18 compute-0 sudo[139558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxjvngesemdenemrveiajexmydtkbuns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051138.4658012-409-96736169307024/AnsiballZ_systemd.py'
Nov 25 06:12:18 compute-0 sudo[139558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:18 compute-0 python3.9[139560]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:18 compute-0 sudo[139558]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:19 compute-0 sudo[139722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccjlvgyeqenjfewzomdjfneumwmqltbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051139.017894-409-202060850209889/AnsiballZ_systemd.py'
Nov 25 06:12:19 compute-0 sudo[139722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:19 compute-0 podman[139687]: 2025-11-25 06:12:19.228966023 +0000 UTC m=+0.045638217 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 25 06:12:19 compute-0 python3.9[139730]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:19 compute-0 sudo[139722]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:19 compute-0 sudo[139884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xciljubaoqburppebkkgbutuetjyenqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051139.6196399-409-138468285475560/AnsiballZ_systemd.py'
Nov 25 06:12:19 compute-0 sudo[139884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:20 compute-0 python3.9[139886]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:20 compute-0 sudo[139884]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:20 compute-0 sudo[140039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsghpsycryiucmrciyhkuwdjzckiwrgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051140.2100127-409-192227999201453/AnsiballZ_systemd.py'
Nov 25 06:12:20 compute-0 sudo[140039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:20 compute-0 python3.9[140041]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:20 compute-0 sudo[140039]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:20 compute-0 sudo[140194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgeztjfxinpjulpnmxvsbnlplxbgnpcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051140.775557-409-131566005987479/AnsiballZ_systemd.py'
Nov 25 06:12:20 compute-0 sudo[140194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:21 compute-0 python3.9[140196]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:21 compute-0 sudo[140194]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:21 compute-0 sudo[140349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkybuehxiozmijsktnjgvhahyhnctwdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051141.34316-409-149532413255417/AnsiballZ_systemd.py'
Nov 25 06:12:21 compute-0 sudo[140349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:21 compute-0 python3.9[140351]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:21 compute-0 sudo[140349]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:22 compute-0 sudo[140504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kymqqgwfvkcbbwjbgckpryyrmydkqvpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051141.9067044-409-244965594994256/AnsiballZ_systemd.py'
Nov 25 06:12:22 compute-0 sudo[140504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:22 compute-0 python3.9[140506]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:22 compute-0 sudo[140504]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:22 compute-0 sudo[140659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjrfyjekieykmbppieriailugtdynpdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051142.4939997-409-174259015986928/AnsiballZ_systemd.py'
Nov 25 06:12:22 compute-0 sudo[140659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:22 compute-0 python3.9[140661]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:22 compute-0 sudo[140659]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:23 compute-0 sudo[140814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyyhzvambnyxtkraqrgmvzasbvhduoos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051143.0821936-409-21175072139925/AnsiballZ_systemd.py'
Nov 25 06:12:23 compute-0 sudo[140814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:23 compute-0 python3.9[140816]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:23 compute-0 sudo[140814]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:23 compute-0 sudo[140969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjvsahvsmsixbwulhbhqesaancbxdtgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051143.6704192-409-43071418364595/AnsiballZ_systemd.py'
Nov 25 06:12:23 compute-0 sudo[140969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:24 compute-0 python3.9[140971]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 06:12:24 compute-0 sudo[140969]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:24 compute-0 sudo[141124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djscduggnjafwvyutnqhgqmunjofazik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051144.4231663-511-77068007650567/AnsiballZ_file.py'
Nov 25 06:12:24 compute-0 sudo[141124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:24 compute-0 python3.9[141126]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:12:24 compute-0 sudo[141124]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:25 compute-0 sudo[141276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xseyyzvemdkiqasuvisxtdnitachtggr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051144.8688433-511-62837833138246/AnsiballZ_file.py'
Nov 25 06:12:25 compute-0 sudo[141276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:25 compute-0 python3.9[141278]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:12:25 compute-0 sudo[141276]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:25 compute-0 sudo[141428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cixptnsksaucmgnfgvwuddakchiukpdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051145.2940433-511-11373742704153/AnsiballZ_file.py'
Nov 25 06:12:25 compute-0 sudo[141428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:25 compute-0 python3.9[141430]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:12:25 compute-0 sudo[141428]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:25 compute-0 sudo[141580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmqrmqkwcziogvrfcqafotjrtbourosi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051145.7202656-511-213144252049927/AnsiballZ_file.py'
Nov 25 06:12:25 compute-0 sudo[141580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:26 compute-0 python3.9[141582]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:12:26 compute-0 sudo[141580]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:26 compute-0 sudo[141732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydtecxsvweyqexyztoiundrrmtiryulp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051146.1502774-511-182804242589387/AnsiballZ_file.py'
Nov 25 06:12:26 compute-0 sudo[141732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:26 compute-0 python3.9[141734]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:12:26 compute-0 sudo[141732]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:26 compute-0 sudo[141884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgpejpuwwknuyhflzevpcjvpiolpbajb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051146.6837988-511-230299742337428/AnsiballZ_file.py'
Nov 25 06:12:26 compute-0 sudo[141884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:27 compute-0 python3.9[141886]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:12:27 compute-0 sudo[141884]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:27 compute-0 sudo[142036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgmnjlhrvnsonabfaivxjtjihsjefszr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051147.153632-554-226899013908079/AnsiballZ_stat.py'
Nov 25 06:12:27 compute-0 sudo[142036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:27 compute-0 python3.9[142038]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:27 compute-0 sudo[142036]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:27 compute-0 sudo[142161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwgylsbpjranzvtlwlbikoexzlzpcwro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051147.153632-554-226899013908079/AnsiballZ_copy.py'
Nov 25 06:12:27 compute-0 sudo[142161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:28 compute-0 python3.9[142163]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764051147.153632-554-226899013908079/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:28 compute-0 sudo[142161]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:28 compute-0 sudo[142313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucinzyaigwtlxzoolaufshhvaahbkptv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051148.264845-554-103304908514410/AnsiballZ_stat.py'
Nov 25 06:12:28 compute-0 sudo[142313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:28 compute-0 python3.9[142315]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:28 compute-0 sudo[142313]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:28 compute-0 sudo[142438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvjdzrfnusjaebvvoyvcipuniyitwpol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051148.264845-554-103304908514410/AnsiballZ_copy.py'
Nov 25 06:12:28 compute-0 sudo[142438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:29 compute-0 python3.9[142440]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764051148.264845-554-103304908514410/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:29 compute-0 sudo[142438]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:29 compute-0 sudo[142590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puqecnifekkmkugyyexcxgpvkfeapvwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051149.1177015-554-124360092311468/AnsiballZ_stat.py'
Nov 25 06:12:29 compute-0 sudo[142590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:29 compute-0 python3.9[142592]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:29 compute-0 sudo[142590]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:29 compute-0 sudo[142715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhrlkywetpspbllbzsrlxmjbtisbpiow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051149.1177015-554-124360092311468/AnsiballZ_copy.py'
Nov 25 06:12:29 compute-0 sudo[142715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:29 compute-0 python3.9[142717]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764051149.1177015-554-124360092311468/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:29 compute-0 sudo[142715]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:30 compute-0 sudo[142867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fotyclqjhlzrplgodaobhjkcrggzovlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051149.9698849-554-199108876641054/AnsiballZ_stat.py'
Nov 25 06:12:30 compute-0 sudo[142867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:30 compute-0 python3.9[142869]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:30 compute-0 sudo[142867]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:30 compute-0 sudo[142992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wazhxhlpevhicivmffklwwyjqedxkbgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051149.9698849-554-199108876641054/AnsiballZ_copy.py'
Nov 25 06:12:30 compute-0 sudo[142992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:30 compute-0 python3.9[142994]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764051149.9698849-554-199108876641054/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:30 compute-0 sudo[142992]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:31 compute-0 sudo[143144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtexkwccfjazpkddmmeywywbvqsycdow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051150.8232098-554-273903534270421/AnsiballZ_stat.py'
Nov 25 06:12:31 compute-0 sudo[143144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:31 compute-0 python3.9[143146]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:31 compute-0 sudo[143144]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:31 compute-0 sudo[143269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubhymrbjuszkkbpuenzoetfocchxwyhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051150.8232098-554-273903534270421/AnsiballZ_copy.py'
Nov 25 06:12:31 compute-0 sudo[143269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:31 compute-0 python3.9[143271]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764051150.8232098-554-273903534270421/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:31 compute-0 sudo[143269]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:31 compute-0 sudo[143421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytabfyddnsdwpfruabsmngfwfxmqhbxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051151.7292826-554-73740503314165/AnsiballZ_stat.py'
Nov 25 06:12:31 compute-0 sudo[143421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:32 compute-0 python3.9[143423]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:32 compute-0 sudo[143421]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:32 compute-0 sudo[143546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owjxhvaxpgopqcbrliptwoydmpyqvggy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051151.7292826-554-73740503314165/AnsiballZ_copy.py'
Nov 25 06:12:32 compute-0 sudo[143546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:32 compute-0 python3.9[143548]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764051151.7292826-554-73740503314165/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:32 compute-0 sudo[143546]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:32 compute-0 sudo[143698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubmelbifkcjwyudkmnlnftdwbijefvpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051152.604186-554-229030584271520/AnsiballZ_stat.py'
Nov 25 06:12:32 compute-0 sudo[143698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:32 compute-0 python3.9[143700]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:32 compute-0 sudo[143698]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:33 compute-0 sudo[143821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqsnvetmojaipkcexvorxbfngcggaiit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051152.604186-554-229030584271520/AnsiballZ_copy.py'
Nov 25 06:12:33 compute-0 sudo[143821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:33 compute-0 python3.9[143823]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764051152.604186-554-229030584271520/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:33 compute-0 sudo[143821]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:33 compute-0 sudo[143973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-firdntqsjynlzcagjaurvwfwyvnmvoyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051153.4475493-554-71842657808118/AnsiballZ_stat.py'
Nov 25 06:12:33 compute-0 sudo[143973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:33 compute-0 python3.9[143975]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:33 compute-0 sudo[143973]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:34 compute-0 sudo[144098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgqadccwnhdgkedgdtxgujtzmkafvjfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051153.4475493-554-71842657808118/AnsiballZ_copy.py'
Nov 25 06:12:34 compute-0 sudo[144098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:34 compute-0 python3.9[144100]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764051153.4475493-554-71842657808118/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:34 compute-0 sudo[144098]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:34 compute-0 sudo[144250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agwfddkalmieeunxqaevbhpvmphmtxbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051154.3383784-667-18262720285748/AnsiballZ_command.py'
Nov 25 06:12:34 compute-0 sudo[144250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:34 compute-0 python3.9[144252]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 25 06:12:34 compute-0 sudo[144250]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:35 compute-0 sudo[144403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nczcioovbmipsouphansmpxqoxegfkxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051154.8505073-676-747648818029/AnsiballZ_file.py'
Nov 25 06:12:35 compute-0 sudo[144403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:35 compute-0 python3.9[144405]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:35 compute-0 sudo[144403]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:35 compute-0 sudo[144555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohicahyiybsyqztfptmrwjqersouunzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051155.3330207-676-48998582590244/AnsiballZ_file.py'
Nov 25 06:12:35 compute-0 sudo[144555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:35 compute-0 python3.9[144557]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:35 compute-0 sudo[144555]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:35 compute-0 sudo[144707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvajkcgkgnpjlhevpmybdflwycswbpvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051155.7785482-676-272891839329935/AnsiballZ_file.py'
Nov 25 06:12:35 compute-0 sudo[144707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:36 compute-0 python3.9[144709]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:36 compute-0 sudo[144707]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:36 compute-0 sudo[144859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhjyufojtncrjglsqdzoxikgjcdwroar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051156.211936-676-198387653618193/AnsiballZ_file.py'
Nov 25 06:12:36 compute-0 sudo[144859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:36 compute-0 python3.9[144861]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:36 compute-0 sudo[144859]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:36 compute-0 sudo[145011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byzfthykzimebcgqwclqptcntvvugole ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051156.6496346-676-176922411849932/AnsiballZ_file.py'
Nov 25 06:12:36 compute-0 sudo[145011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:36 compute-0 python3.9[145013]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:36 compute-0 sudo[145011]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:37 compute-0 sudo[145163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luiovtjnefkfwxqkykrdtnkufqpcpcls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051157.0869932-676-56156345188261/AnsiballZ_file.py'
Nov 25 06:12:37 compute-0 sudo[145163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:37 compute-0 python3.9[145165]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:37 compute-0 sudo[145163]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:37 compute-0 sudo[145315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypvjpfbvatgcgcgojatfrikiuuozsfet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051157.5167434-676-216593869396748/AnsiballZ_file.py'
Nov 25 06:12:37 compute-0 sudo[145315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:37 compute-0 python3.9[145317]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:37 compute-0 sudo[145315]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:38 compute-0 sudo[145467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kngehuzoqfutsiqcsuqawoayilccglqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051157.9467762-676-64675180299298/AnsiballZ_file.py'
Nov 25 06:12:38 compute-0 sudo[145467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:38 compute-0 python3.9[145469]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:38 compute-0 sudo[145467]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:38 compute-0 sudo[145619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqsdyaibxdnmesmotikvtgppkibexwck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051158.3817546-676-184452964534795/AnsiballZ_file.py'
Nov 25 06:12:38 compute-0 sudo[145619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:38 compute-0 python3.9[145621]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:38 compute-0 sudo[145619]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:38 compute-0 sudo[145771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjithyselplcbhpivqpjhzxlnunktyip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051158.8139088-676-98529923857757/AnsiballZ_file.py'
Nov 25 06:12:38 compute-0 sudo[145771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:39 compute-0 python3.9[145773]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:39 compute-0 sudo[145771]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:39 compute-0 sudo[145932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scnzlfyvnlitrpfnbpsypwzxklaigylx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051159.248396-676-83023197577105/AnsiballZ_file.py'
Nov 25 06:12:39 compute-0 sudo[145932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:39 compute-0 podman[145897]: 2025-11-25 06:12:39.497351727 +0000 UTC m=+0.084347688 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 06:12:39 compute-0 python3.9[145941]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:39 compute-0 sudo[145932]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:39 compute-0 sudo[146098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxjddwmsxqvqdavswhrjpefbsgbjkilw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051159.777452-676-72842783973612/AnsiballZ_file.py'
Nov 25 06:12:39 compute-0 sudo[146098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:40 compute-0 python3.9[146100]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:40 compute-0 sudo[146098]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:40 compute-0 sudo[146250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufilfznjqtlqtsdpkskeplcklglcharz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051160.2599423-676-58325957167947/AnsiballZ_file.py'
Nov 25 06:12:40 compute-0 sudo[146250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:40 compute-0 python3.9[146252]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:40 compute-0 sudo[146250]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:40 compute-0 sudo[146402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmfpskofwmbmrmissvzbwboqdafhachj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051160.712155-676-84652855715855/AnsiballZ_file.py'
Nov 25 06:12:40 compute-0 sudo[146402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:41 compute-0 python3.9[146404]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:41 compute-0 sudo[146402]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:41 compute-0 sudo[146554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aafdhtglbkslwrqvrahdrhheqatnsphn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051161.1994581-775-216558594405201/AnsiballZ_stat.py'
Nov 25 06:12:41 compute-0 sudo[146554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:41 compute-0 python3.9[146556]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:41 compute-0 sudo[146554]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:41 compute-0 sudo[146677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywpdnsgclbcoqajbzaewlceceishrita ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051161.1994581-775-216558594405201/AnsiballZ_copy.py'
Nov 25 06:12:41 compute-0 sudo[146677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:41 compute-0 python3.9[146679]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051161.1994581-775-216558594405201/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:41 compute-0 sudo[146677]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:42 compute-0 sudo[146829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aryhthmyrwygywhnaipxdzogfefplstv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051162.046376-775-229738862060286/AnsiballZ_stat.py'
Nov 25 06:12:42 compute-0 sudo[146829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:42 compute-0 python3.9[146831]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:42 compute-0 sudo[146829]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:42 compute-0 sudo[146952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohohvgwqnhzwfuijobugjpfeyrqjhbzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051162.046376-775-229738862060286/AnsiballZ_copy.py'
Nov 25 06:12:42 compute-0 sudo[146952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:42 compute-0 python3.9[146954]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051162.046376-775-229738862060286/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:42 compute-0 sudo[146952]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:43 compute-0 sudo[147104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cybmvvoypmuqpwhilknndahueawiumyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051162.8674793-775-41713101161415/AnsiballZ_stat.py'
Nov 25 06:12:43 compute-0 sudo[147104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:43 compute-0 python3.9[147106]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:43 compute-0 sudo[147104]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:43 compute-0 sudo[147227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llxbuutvmiekjydkqxhbrikyuvxlnnwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051162.8674793-775-41713101161415/AnsiballZ_copy.py'
Nov 25 06:12:43 compute-0 sudo[147227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:43 compute-0 python3.9[147229]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051162.8674793-775-41713101161415/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:43 compute-0 sudo[147227]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:43 compute-0 sudo[147379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twykytgiksistjzuyjwhbereonnpmlll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051163.6930735-775-103898287625167/AnsiballZ_stat.py'
Nov 25 06:12:43 compute-0 sudo[147379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:44 compute-0 python3.9[147381]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:44 compute-0 sudo[147379]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:44 compute-0 sudo[147502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ollhjclemwnblsvycotiwetwpjqwxgdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051163.6930735-775-103898287625167/AnsiballZ_copy.py'
Nov 25 06:12:44 compute-0 sudo[147502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:44 compute-0 python3.9[147504]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051163.6930735-775-103898287625167/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:44 compute-0 sudo[147502]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:44 compute-0 sudo[147654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vktnrvuycmfzsdgyhlxtcmawckhmjflb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051164.5783103-775-257419377643587/AnsiballZ_stat.py'
Nov 25 06:12:44 compute-0 sudo[147654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:44 compute-0 python3.9[147656]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:44 compute-0 sudo[147654]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:45 compute-0 sudo[147777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiyttozwpveazagukmnumidvloxiupbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051164.5783103-775-257419377643587/AnsiballZ_copy.py'
Nov 25 06:12:45 compute-0 sudo[147777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:45 compute-0 python3.9[147779]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051164.5783103-775-257419377643587/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:45 compute-0 sudo[147777]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:45 compute-0 sudo[147929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfubqvoehrumrygpbsijxttkfnthhiks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051165.391406-775-269328604360503/AnsiballZ_stat.py'
Nov 25 06:12:45 compute-0 sudo[147929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:45 compute-0 python3.9[147931]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:45 compute-0 sudo[147929]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:45 compute-0 sudo[148052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmdylzqwmtqlyfhzaczlsefsbcaqgtld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051165.391406-775-269328604360503/AnsiballZ_copy.py'
Nov 25 06:12:45 compute-0 sudo[148052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:46 compute-0 python3.9[148054]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051165.391406-775-269328604360503/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:46 compute-0 sudo[148052]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:46 compute-0 sudo[148204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlurnvftehrrakqdeshyacqmqtywjxhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051166.198-775-178960560565859/AnsiballZ_stat.py'
Nov 25 06:12:46 compute-0 sudo[148204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:46 compute-0 python3.9[148206]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:46 compute-0 sudo[148204]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:46 compute-0 sudo[148327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srfeenzgqhxccvejpgjfxupgwsvmcybz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051166.198-775-178960560565859/AnsiballZ_copy.py'
Nov 25 06:12:46 compute-0 sudo[148327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:46 compute-0 python3.9[148329]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051166.198-775-178960560565859/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:46 compute-0 sudo[148327]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:12:46.959 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:12:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:12:46.959 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:12:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:12:46.959 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:12:47 compute-0 sudo[148480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmixtgqcuxpqzrzcnefyjeayxwsxmgvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051167.0072947-775-4965144826556/AnsiballZ_stat.py'
Nov 25 06:12:47 compute-0 sudo[148480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:47 compute-0 python3.9[148482]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:47 compute-0 sudo[148480]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:47 compute-0 sudo[148603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uowlwtmylybmesefapbitkarhqzxsnme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051167.0072947-775-4965144826556/AnsiballZ_copy.py'
Nov 25 06:12:47 compute-0 sudo[148603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:47 compute-0 python3.9[148605]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051167.0072947-775-4965144826556/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:47 compute-0 sudo[148603]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:47 compute-0 sudo[148755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtvstibzijmjxpvpstbmdoebgumklspy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051167.793342-775-263911594879308/AnsiballZ_stat.py'
Nov 25 06:12:47 compute-0 sudo[148755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:48 compute-0 python3.9[148757]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:48 compute-0 sudo[148755]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:48 compute-0 sudo[148878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yemtmqxvvjuhtbmjuohjvmwrknyghehb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051167.793342-775-263911594879308/AnsiballZ_copy.py'
Nov 25 06:12:48 compute-0 sudo[148878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:48 compute-0 python3.9[148880]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051167.793342-775-263911594879308/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:48 compute-0 sudo[148878]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:48 compute-0 sudo[149030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxoatgxncgihsfshkoxdhwelfrgsklhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051168.6855724-775-33737755093698/AnsiballZ_stat.py'
Nov 25 06:12:48 compute-0 sudo[149030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:48 compute-0 python3.9[149032]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:49 compute-0 sudo[149030]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:49 compute-0 sudo[149153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdyrpirhxdmorbijnybmszjuzehmimyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051168.6855724-775-33737755093698/AnsiballZ_copy.py'
Nov 25 06:12:49 compute-0 sudo[149153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:49 compute-0 podman[149156]: 2025-11-25 06:12:49.288889472 +0000 UTC m=+0.035534011 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 25 06:12:49 compute-0 python3.9[149155]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051168.6855724-775-33737755093698/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:49 compute-0 sudo[149153]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:49 compute-0 sudo[149321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egmjzahdcpgfehdraptgzytcccbrhyln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051169.4950986-775-151090599932065/AnsiballZ_stat.py'
Nov 25 06:12:49 compute-0 sudo[149321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:49 compute-0 python3.9[149323]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:49 compute-0 sudo[149321]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:50 compute-0 sudo[149444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cedzdiveyoontnjkhhbznqhhsgfbljjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051169.4950986-775-151090599932065/AnsiballZ_copy.py'
Nov 25 06:12:50 compute-0 sudo[149444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:50 compute-0 python3.9[149446]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051169.4950986-775-151090599932065/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:50 compute-0 sudo[149444]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:50 compute-0 sudo[149596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swtcxflnvdmqosbrxsoqivtyhxsxccmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051170.299004-775-25928330334696/AnsiballZ_stat.py'
Nov 25 06:12:50 compute-0 sudo[149596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:50 compute-0 python3.9[149598]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:50 compute-0 sudo[149596]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:50 compute-0 sudo[149719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdtdupvwuhabutejilgdagyjmqdxsduf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051170.299004-775-25928330334696/AnsiballZ_copy.py'
Nov 25 06:12:50 compute-0 sudo[149719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:50 compute-0 python3.9[149721]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051170.299004-775-25928330334696/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:51 compute-0 sudo[149719]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:51 compute-0 sudo[149871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvqfzcludatojcdgpfmxdestixwuhneq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051171.1113493-775-260021364003364/AnsiballZ_stat.py'
Nov 25 06:12:51 compute-0 sudo[149871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:51 compute-0 python3.9[149873]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:51 compute-0 sudo[149871]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:51 compute-0 sudo[149994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbmnrnrrvulcfkgfxmwgiglfsvoynnry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051171.1113493-775-260021364003364/AnsiballZ_copy.py'
Nov 25 06:12:51 compute-0 sudo[149994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:51 compute-0 python3.9[149996]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051171.1113493-775-260021364003364/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:51 compute-0 sudo[149994]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:52 compute-0 sudo[150146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obsalbmemnlrujvgkepuhwtdljvtpiwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051171.9273665-775-143940971375676/AnsiballZ_stat.py'
Nov 25 06:12:52 compute-0 sudo[150146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:52 compute-0 python3.9[150148]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:12:52 compute-0 sudo[150146]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:52 compute-0 sudo[150269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngeaqjkyxeidisibzwgogvxrnsrcvhgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051171.9273665-775-143940971375676/AnsiballZ_copy.py'
Nov 25 06:12:52 compute-0 sudo[150269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:52 compute-0 python3.9[150271]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051171.9273665-775-143940971375676/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:52 compute-0 sudo[150269]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:53 compute-0 python3.9[150421]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:12:53 compute-0 sudo[150574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqnutnhgurywpyrccwgctpzcfexhncyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051173.221427-981-109259331461403/AnsiballZ_seboolean.py'
Nov 25 06:12:53 compute-0 sudo[150574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:53 compute-0 python3.9[150576]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 25 06:12:54 compute-0 sudo[150574]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:54 compute-0 sudo[150730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkkbvsafuceeavdtofwrwnzlkvrknlmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051174.5460634-989-55378811269592/AnsiballZ_copy.py'
Nov 25 06:12:54 compute-0 dbus-broker-launch[734]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 25 06:12:54 compute-0 sudo[150730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:54 compute-0 python3.9[150732]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:54 compute-0 sudo[150730]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:55 compute-0 sudo[150882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwsscstpozcelcjvhziohzsiywnsjoov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051174.9703174-989-214782265841312/AnsiballZ_copy.py'
Nov 25 06:12:55 compute-0 sudo[150882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:55 compute-0 python3.9[150884]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:55 compute-0 sudo[150882]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:55 compute-0 sudo[151034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyscuuwptsadmidboxnbnunwfjnhaexw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051175.3903008-989-199468250874051/AnsiballZ_copy.py'
Nov 25 06:12:55 compute-0 sudo[151034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:55 compute-0 python3.9[151036]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:55 compute-0 sudo[151034]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:55 compute-0 sudo[151186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qusioqmckssubjikucqiyxtbqrdbcrgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051175.804727-989-177807794221251/AnsiballZ_copy.py'
Nov 25 06:12:55 compute-0 sudo[151186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:56 compute-0 python3.9[151188]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:56 compute-0 sudo[151186]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:56 compute-0 sudo[151338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkugylngdiemeoqzynzzrynszqjdhlyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051176.217735-989-184474008305894/AnsiballZ_copy.py'
Nov 25 06:12:56 compute-0 sudo[151338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:56 compute-0 python3.9[151340]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:56 compute-0 sudo[151338]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:56 compute-0 sudo[151490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trtqdglgzwjopjictwiiswqnosqujxnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051176.68123-1025-260485840511778/AnsiballZ_copy.py'
Nov 25 06:12:56 compute-0 sudo[151490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:57 compute-0 python3.9[151492]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:57 compute-0 sudo[151490]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:57 compute-0 sudo[151642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofwvbvauvakdeyxkmozxlqnypmwtpunz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051177.1213462-1025-273627654557701/AnsiballZ_copy.py'
Nov 25 06:12:57 compute-0 sudo[151642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:57 compute-0 python3.9[151644]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:57 compute-0 sudo[151642]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:57 compute-0 sudo[151794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zshwwowkmqumlwinpgtlrqxgofhanquh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051177.5563052-1025-73177100615185/AnsiballZ_copy.py'
Nov 25 06:12:57 compute-0 sudo[151794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:57 compute-0 python3.9[151796]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:57 compute-0 sudo[151794]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:58 compute-0 sudo[151946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vftrbnkbdiunjceqbqhykuvegytnuwna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051177.9955564-1025-217403573966366/AnsiballZ_copy.py'
Nov 25 06:12:58 compute-0 sudo[151946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:58 compute-0 python3.9[151948]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:58 compute-0 sudo[151946]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:58 compute-0 sudo[152098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhjjxyqljklrhopxudfjbwlstjfubztz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051178.4293501-1025-77788282406641/AnsiballZ_copy.py'
Nov 25 06:12:58 compute-0 sudo[152098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:58 compute-0 python3.9[152100]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:12:58 compute-0 sudo[152098]: pam_unix(sudo:session): session closed for user root
Nov 25 06:12:59 compute-0 sudo[152250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gceojyemsihbcxfrmblceybzphcbssyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051179.0162947-1061-18799604421833/AnsiballZ_systemd.py'
Nov 25 06:12:59 compute-0 sudo[152250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:12:59 compute-0 python3.9[152252]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:12:59 compute-0 systemd[1]: Reloading.
Nov 25 06:12:59 compute-0 systemd-sysv-generator[152276]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:12:59 compute-0 systemd-rc-local-generator[152272]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:12:59 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Nov 25 06:12:59 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Nov 25 06:12:59 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 25 06:12:59 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 25 06:12:59 compute-0 systemd[1]: Starting libvirt logging daemon...
Nov 25 06:12:59 compute-0 systemd[1]: Started libvirt logging daemon.
Nov 25 06:12:59 compute-0 sudo[152250]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:00 compute-0 sudo[152442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqsaqukmkoihbrqwiymoukklyqjavxxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051179.8699477-1061-206136375163255/AnsiballZ_systemd.py'
Nov 25 06:13:00 compute-0 sudo[152442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:00 compute-0 python3.9[152444]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:13:00 compute-0 systemd[1]: Reloading.
Nov 25 06:13:00 compute-0 systemd-rc-local-generator[152464]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:13:00 compute-0 systemd-sysv-generator[152468]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:13:00 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 25 06:13:00 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 25 06:13:00 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 25 06:13:00 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 25 06:13:00 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 25 06:13:00 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 25 06:13:00 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 25 06:13:00 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 25 06:13:00 compute-0 sudo[152442]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:00 compute-0 sudo[152658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udawglnzpexggdwdhupowxshoivrtrft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051180.6759586-1061-17146828134081/AnsiballZ_systemd.py'
Nov 25 06:13:00 compute-0 sudo[152658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:01 compute-0 python3.9[152660]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:13:01 compute-0 systemd[1]: Reloading.
Nov 25 06:13:01 compute-0 systemd-rc-local-generator[152683]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:13:01 compute-0 systemd-sysv-generator[152687]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:13:01 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 25 06:13:01 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 25 06:13:01 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 25 06:13:01 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 25 06:13:01 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 25 06:13:01 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 25 06:13:01 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 25 06:13:01 compute-0 sudo[152658]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:01 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 25 06:13:01 compute-0 sudo[152872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urqmewyesnlrgsyydivvfheryfbizjor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051181.4639847-1061-56240676296060/AnsiballZ_systemd.py'
Nov 25 06:13:01 compute-0 sudo[152872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:01 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 25 06:13:01 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 25 06:13:01 compute-0 python3.9[152875]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:13:01 compute-0 systemd[1]: Reloading.
Nov 25 06:13:01 compute-0 systemd-rc-local-generator[152902]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:13:01 compute-0 systemd-sysv-generator[152908]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:13:02 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Nov 25 06:13:02 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 25 06:13:02 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 25 06:13:02 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 25 06:13:02 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 25 06:13:02 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 25 06:13:02 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 25 06:13:02 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 25 06:13:02 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 25 06:13:02 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 25 06:13:02 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 25 06:13:02 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 25 06:13:02 compute-0 sudo[152872]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:02 compute-0 setroubleshoot[152697]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 66a23d0a-1744-4ffe-8be8-029a31b0b47d
Nov 25 06:13:02 compute-0 setroubleshoot[152697]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
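The setroubleshoot message above suggests two remediation paths for the virtlogd dac_read_search denial. A minimal sketch of that workflow, assuming the audit userspace tools (auditctl, ausearch, audit2allow, semodule) are installed and this is run as root on the affected compute node; restarting virtlogd.service to reproduce the AVC is an assumption, since the log does not state which operation triggered it:

```shell
# Sketch of the remediation steps suggested by setroubleshoot (run as root).

# 1. Turn on full auditing so AVC records include PATH information
#    (this is the watch rule setroubleshoot itself recommends).
auditctl -w /etc/shadow -p w

# 2. Reproduce the denial; restarting the daemon is one plausible trigger.
systemctl restart virtlogd.service

# 3. Inspect recent AVC records. If a PATH record appears, fix the
#    ownership/permissions on that file instead of changing policy.
ausearch -m avc -ts recent

# 4. Otherwise, build and load a local policy module as a temporary
#    workaround, per the catchall plugin suggestion.
ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
semodule -X 300 -i my-virtlogd.pp
```

These commands require root and an active audit subsystem; on a production node, prefer fixing file permissions (step 3) or filing a bug against the selinux-policy package over loading a local module.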
Nov 25 06:13:02 compute-0 sudo[153095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnucrxzvyykqmowzzujhgimfzczqvaco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051182.3057108-1061-50721376704234/AnsiballZ_systemd.py'
Nov 25 06:13:02 compute-0 sudo[153095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:02 compute-0 python3.9[153097]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:13:02 compute-0 systemd[1]: Reloading.
Nov 25 06:13:02 compute-0 systemd-rc-local-generator[153121]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:13:02 compute-0 systemd-sysv-generator[153124]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:13:02 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Nov 25 06:13:02 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Nov 25 06:13:02 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 25 06:13:02 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 25 06:13:02 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 25 06:13:02 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 25 06:13:02 compute-0 systemd[1]: Starting libvirt secret daemon...
Nov 25 06:13:03 compute-0 systemd[1]: Started libvirt secret daemon.
Nov 25 06:13:03 compute-0 sudo[153095]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:03 compute-0 sudo[153307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcnltlgiobpmsourdrkzcmxhncqwxejn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051183.2060347-1098-30978387622782/AnsiballZ_file.py'
Nov 25 06:13:03 compute-0 sudo[153307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:03 compute-0 python3.9[153309]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:03 compute-0 sudo[153307]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:03 compute-0 sudo[153459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdbsijzuyxhyzbwjytbqbvqwcviubvjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051183.7522416-1106-259695568824780/AnsiballZ_find.py'
Nov 25 06:13:03 compute-0 sudo[153459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:04 compute-0 python3.9[153461]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 06:13:04 compute-0 sudo[153459]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:04 compute-0 sudo[153611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkbkpjqbmvywiavjyxzoxaahkrwuzyht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051184.3601077-1120-202521276330609/AnsiballZ_stat.py'
Nov 25 06:13:04 compute-0 sudo[153611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:04 compute-0 python3.9[153613]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:04 compute-0 sudo[153611]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:04 compute-0 sudo[153734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mamddgmhdwgxzapnsnrgeawwjvmnoejw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051184.3601077-1120-202521276330609/AnsiballZ_copy.py'
Nov 25 06:13:04 compute-0 sudo[153734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:05 compute-0 python3.9[153736]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051184.3601077-1120-202521276330609/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:05 compute-0 sudo[153734]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:05 compute-0 sudo[153886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjgovhvzggajfmbdqipskkrowrufyqst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051185.2770407-1136-110816096004280/AnsiballZ_file.py'
Nov 25 06:13:05 compute-0 sudo[153886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:05 compute-0 python3.9[153888]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:05 compute-0 sudo[153886]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:05 compute-0 sudo[154038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqaaalcatntesznwqmhaxmeehhrmuskp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051185.747758-1144-276943292062469/AnsiballZ_stat.py'
Nov 25 06:13:05 compute-0 sudo[154038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:06 compute-0 python3.9[154040]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:06 compute-0 sudo[154038]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:06 compute-0 sudo[154116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocemzzpvgotjntxbbwvdmvuwbfpubplp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051185.747758-1144-276943292062469/AnsiballZ_file.py'
Nov 25 06:13:06 compute-0 sudo[154116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:06 compute-0 python3.9[154118]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:06 compute-0 sudo[154116]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:06 compute-0 sudo[154268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwnugsoocpirteuglwwratnxycjvayfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051186.5695822-1156-62723881675542/AnsiballZ_stat.py'
Nov 25 06:13:06 compute-0 sudo[154268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:06 compute-0 python3.9[154270]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:06 compute-0 sudo[154268]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:07 compute-0 sudo[154346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilrfwfyskkhnbkadyskwwgdmxchzucts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051186.5695822-1156-62723881675542/AnsiballZ_file.py'
Nov 25 06:13:07 compute-0 sudo[154346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:07 compute-0 python3.9[154348]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.fn9pj6nm recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:07 compute-0 sudo[154346]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:07 compute-0 sudo[154498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gavicdcffubnxpjpyryrfcncmevckpug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051187.3854783-1168-183979299363298/AnsiballZ_stat.py'
Nov 25 06:13:07 compute-0 sudo[154498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:07 compute-0 python3.9[154500]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:07 compute-0 sudo[154498]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:08 compute-0 sudo[154576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwrrgtppvzhxsfrnshgehgixidufxvjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051187.3854783-1168-183979299363298/AnsiballZ_file.py'
Nov 25 06:13:08 compute-0 sudo[154576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:08 compute-0 python3.9[154578]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:08 compute-0 sudo[154576]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:08 compute-0 sudo[154728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tehsznaxihcaqxmretlifbetqctrfkhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051188.3654606-1181-158870855875165/AnsiballZ_command.py'
Nov 25 06:13:08 compute-0 sudo[154728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:08 compute-0 python3.9[154730]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:13:08 compute-0 sudo[154728]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:09 compute-0 sudo[154881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnrpuxmtaivarsxorftdfxorjvjpnffr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764051188.8376002-1189-73575218579806/AnsiballZ_edpm_nftables_from_files.py'
Nov 25 06:13:09 compute-0 sudo[154881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:09 compute-0 python3[154883]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 25 06:13:09 compute-0 sudo[154881]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:09 compute-0 sudo[155040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyviqhhbsixwxwhkonemozvcxxsewjts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051189.4338686-1197-121417864607405/AnsiballZ_stat.py'
Nov 25 06:13:09 compute-0 sudo[155040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:09 compute-0 podman[155007]: 2025-11-25 06:13:09.716014144 +0000 UTC m=+0.092661138 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:13:09 compute-0 python3.9[155051]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:09 compute-0 sudo[155040]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:09 compute-0 sudo[155134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiyvfodqwfsqchrxlwevnckngqxieqsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051189.4338686-1197-121417864607405/AnsiballZ_file.py'
Nov 25 06:13:09 compute-0 sudo[155134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:10 compute-0 python3.9[155136]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:10 compute-0 sudo[155134]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:10 compute-0 sudo[155286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtjoptmhhtnfprzaqzglaahffjlirvsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051190.2771661-1209-165738650359509/AnsiballZ_stat.py'
Nov 25 06:13:10 compute-0 sudo[155286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:10 compute-0 python3.9[155288]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:10 compute-0 sudo[155286]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:10 compute-0 sudo[155364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njtcipywpqteaepvyamlghwapzuginlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051190.2771661-1209-165738650359509/AnsiballZ_file.py'
Nov 25 06:13:10 compute-0 sudo[155364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:10 compute-0 python3.9[155366]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:10 compute-0 sudo[155364]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:11 compute-0 sudo[155516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgmczrdqizyvgcjcztgbrolxfbceomot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051191.0762079-1221-249632086560578/AnsiballZ_stat.py'
Nov 25 06:13:11 compute-0 sudo[155516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:11 compute-0 python3.9[155518]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:11 compute-0 sudo[155516]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:11 compute-0 sudo[155594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdbgyiktzdpassuosccesfnvtakqrkar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051191.0762079-1221-249632086560578/AnsiballZ_file.py'
Nov 25 06:13:11 compute-0 sudo[155594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:11 compute-0 python3.9[155596]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:11 compute-0 sudo[155594]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:12 compute-0 sudo[155746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztopzorbszypukspajftapduzjqfwwof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051191.8803945-1233-13921612427521/AnsiballZ_stat.py'
Nov 25 06:13:12 compute-0 sudo[155746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:12 compute-0 python3.9[155748]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:12 compute-0 sudo[155746]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:12 compute-0 sudo[155824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnrofhloxjjndmhvbfecbthhrlamjutr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051191.8803945-1233-13921612427521/AnsiballZ_file.py'
Nov 25 06:13:12 compute-0 sudo[155824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:12 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 25 06:13:12 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 25 06:13:12 compute-0 python3.9[155826]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:12 compute-0 sudo[155824]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:12 compute-0 sudo[155976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cltrogczhtsqwlwmnengegfztwpywhgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051192.698358-1245-89617888834530/AnsiballZ_stat.py'
Nov 25 06:13:12 compute-0 sudo[155976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:13 compute-0 python3.9[155978]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:13 compute-0 sudo[155976]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:13 compute-0 sudo[156101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysofmouamdywcdtnpnuscgdwzgmhakop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051192.698358-1245-89617888834530/AnsiballZ_copy.py'
Nov 25 06:13:13 compute-0 sudo[156101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:13 compute-0 python3.9[156103]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051192.698358-1245-89617888834530/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:13 compute-0 sudo[156101]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:13 compute-0 sudo[156253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrbhqmfmmlimgqakomddmsfvpbcztuim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051193.5939498-1260-37672584593462/AnsiballZ_file.py'
Nov 25 06:13:13 compute-0 sudo[156253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:13 compute-0 python3.9[156255]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:13 compute-0 sudo[156253]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:14 compute-0 sudo[156405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcccmpbkmmnknnpaoubqdjmaviikwgao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051194.0468698-1268-83445532761861/AnsiballZ_command.py'
Nov 25 06:13:14 compute-0 sudo[156405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:14 compute-0 python3.9[156407]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:13:14 compute-0 sudo[156405]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:14 compute-0 sudo[156560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvivqdiwkddeqaiwlprwtrlvebxuotwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051194.5163615-1276-271439312055765/AnsiballZ_blockinfile.py'
Nov 25 06:13:14 compute-0 sudo[156560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:14 compute-0 python3.9[156562]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:14 compute-0 sudo[156560]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:15 compute-0 sudo[156712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwsnidvipopobrhzduftclteeqnadeds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051195.1112082-1285-88849888899390/AnsiballZ_command.py'
Nov 25 06:13:15 compute-0 sudo[156712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:15 compute-0 python3.9[156714]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:13:15 compute-0 sudo[156712]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:15 compute-0 sudo[156865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxhovmihgmhcnqcliugtvtgnfxpugsqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051195.5740416-1293-245642779105288/AnsiballZ_stat.py'
Nov 25 06:13:15 compute-0 sudo[156865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:15 compute-0 python3.9[156867]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:13:15 compute-0 sudo[156865]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:16 compute-0 sudo[157019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucqhqvizrukqfsxfksyynelsgkiyorle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051196.0459282-1301-180816622746974/AnsiballZ_command.py'
Nov 25 06:13:16 compute-0 sudo[157019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:16 compute-0 python3.9[157021]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:13:16 compute-0 sudo[157019]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:16 compute-0 sudo[157174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quphavfhkpbewejkvtgfpsjxengenunb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051196.5598755-1309-68916011312767/AnsiballZ_file.py'
Nov 25 06:13:16 compute-0 sudo[157174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:16 compute-0 python3.9[157176]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:16 compute-0 sudo[157174]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:17 compute-0 sudo[157326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yauqimwtjqsfpyhqgatnwtdcppdqtgfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051197.0534246-1317-220496028405088/AnsiballZ_stat.py'
Nov 25 06:13:17 compute-0 sudo[157326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:17 compute-0 python3.9[157328]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:17 compute-0 sudo[157326]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:17 compute-0 sudo[157449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxybykizipinmnkhiciyxiotyyillbyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051197.0534246-1317-220496028405088/AnsiballZ_copy.py'
Nov 25 06:13:17 compute-0 sudo[157449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:17 compute-0 python3.9[157451]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051197.0534246-1317-220496028405088/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:17 compute-0 sudo[157449]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:18 compute-0 sudo[157601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrcdoptkxeldxklabjsmysuilizvfbdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051197.8762987-1332-76791024677793/AnsiballZ_stat.py'
Nov 25 06:13:18 compute-0 sudo[157601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:18 compute-0 python3.9[157603]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:18 compute-0 sudo[157601]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:18 compute-0 sudo[157724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsilkrvvfafgmlibqbzrhirlhcalomvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051197.8762987-1332-76791024677793/AnsiballZ_copy.py'
Nov 25 06:13:18 compute-0 sudo[157724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:18 compute-0 python3.9[157726]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051197.8762987-1332-76791024677793/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:18 compute-0 sudo[157724]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:18 compute-0 sudo[157876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyirfucjyqpcaxdnwozaetwxyqxfsxvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051198.689562-1347-172594444689840/AnsiballZ_stat.py'
Nov 25 06:13:18 compute-0 sudo[157876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:19 compute-0 python3.9[157878]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:19 compute-0 sudo[157876]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:19 compute-0 sudo[157999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faohlfxdhsitbrepwknoxqcdiwbtvrmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051198.689562-1347-172594444689840/AnsiballZ_copy.py'
Nov 25 06:13:19 compute-0 sudo[157999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:19 compute-0 python3.9[158001]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051198.689562-1347-172594444689840/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:19 compute-0 sudo[157999]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:19 compute-0 sudo[158163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etewhzwuhwryrfcfnkmmlnrltocfqmmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051199.5353084-1362-56101758564965/AnsiballZ_systemd.py'
Nov 25 06:13:19 compute-0 sudo[158163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:19 compute-0 podman[158125]: 2025-11-25 06:13:19.734074435 +0000 UTC m=+0.041306128 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 06:13:19 compute-0 python3.9[158169]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:13:19 compute-0 systemd[1]: Reloading.
Nov 25 06:13:20 compute-0 systemd-rc-local-generator[158190]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:13:20 compute-0 systemd-sysv-generator[158193]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:13:20 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Nov 25 06:13:20 compute-0 sudo[158163]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:20 compute-0 sudo[158359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgyfujujprjcbvexwjtgmtijgtihpeqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051200.3642762-1370-272104961349464/AnsiballZ_systemd.py'
Nov 25 06:13:20 compute-0 sudo[158359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:20 compute-0 python3.9[158361]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 25 06:13:20 compute-0 systemd[1]: Reloading.
Nov 25 06:13:20 compute-0 systemd-rc-local-generator[158381]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:13:20 compute-0 systemd-sysv-generator[158385]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:13:21 compute-0 systemd[1]: Reloading.
Nov 25 06:13:21 compute-0 systemd-rc-local-generator[158418]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:13:21 compute-0 systemd-sysv-generator[158421]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:13:21 compute-0 sudo[158359]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:21 compute-0 sshd-session[104076]: Connection closed by 192.168.122.30 port 54614
Nov 25 06:13:21 compute-0 sshd-session[104073]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:13:21 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Nov 25 06:13:21 compute-0 systemd[1]: session-22.scope: Consumed 2min 20.051s CPU time.
Nov 25 06:13:21 compute-0 systemd-logind[744]: Session 22 logged out. Waiting for processes to exit.
Nov 25 06:13:21 compute-0 systemd-logind[744]: Removed session 22.
Nov 25 06:13:26 compute-0 sshd-session[158458]: Accepted publickey for zuul from 192.168.122.30 port 53262 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:13:26 compute-0 systemd-logind[744]: New session 23 of user zuul.
Nov 25 06:13:26 compute-0 systemd[1]: Started Session 23 of User zuul.
Nov 25 06:13:26 compute-0 sshd-session[158458]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:13:27 compute-0 python3.9[158611]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:13:28 compute-0 python3.9[158765]: ansible-ansible.builtin.service_facts Invoked
Nov 25 06:13:28 compute-0 network[158782]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 06:13:28 compute-0 network[158783]: 'network-scripts' will be removed from distribution in near future.
Nov 25 06:13:28 compute-0 network[158784]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 06:13:30 compute-0 sudo[159053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spnrenplnwrjxvbuwbnrbuuzzdfnmghl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051210.7100396-47-272147301238213/AnsiballZ_setup.py'
Nov 25 06:13:30 compute-0 sudo[159053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:31 compute-0 python3.9[159055]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 06:13:31 compute-0 sudo[159053]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:31 compute-0 sudo[159137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbmibpeahxroadnfogvdbmnnagykskzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051210.7100396-47-272147301238213/AnsiballZ_dnf.py'
Nov 25 06:13:31 compute-0 sudo[159137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:31 compute-0 python3.9[159139]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:13:36 compute-0 sudo[159137]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:36 compute-0 sudo[159290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssattotgesmsyplphjsxrerosszkonwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051216.1467617-59-73055124265512/AnsiballZ_stat.py'
Nov 25 06:13:36 compute-0 sudo[159290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:36 compute-0 python3.9[159292]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:13:36 compute-0 sudo[159290]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:37 compute-0 sudo[159442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izbdcomvzjxejxjchsgvjljtkyrdhers ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051216.758083-69-114383348015057/AnsiballZ_command.py'
Nov 25 06:13:37 compute-0 sudo[159442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:37 compute-0 python3.9[159444]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:13:37 compute-0 sudo[159442]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:37 compute-0 sudo[159595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpnevjimdyztlzjyqirihsidqomflrpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051217.3902285-79-66161550535289/AnsiballZ_stat.py'
Nov 25 06:13:37 compute-0 sudo[159595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:37 compute-0 python3.9[159597]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:13:37 compute-0 sudo[159595]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:37 compute-0 sudo[159747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jizzeuhxhmqcvrlinzoffpzoybqmfqjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051217.825727-87-35345373704655/AnsiballZ_command.py'
Nov 25 06:13:37 compute-0 sudo[159747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:38 compute-0 python3.9[159749]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:13:38 compute-0 sudo[159747]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:38 compute-0 sudo[159900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcfozqaagpmfzicqkkobehvtazamjzur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051218.2677104-95-140666976316661/AnsiballZ_stat.py'
Nov 25 06:13:38 compute-0 sudo[159900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:38 compute-0 python3.9[159902]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:38 compute-0 sudo[159900]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:38 compute-0 sudo[160023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfqegthwenkjxbyysjxivdfqexuwzcbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051218.2677104-95-140666976316661/AnsiballZ_copy.py'
Nov 25 06:13:38 compute-0 sudo[160023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:39 compute-0 python3.9[160025]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051218.2677104-95-140666976316661/.source.iscsi _original_basename=.yabh8tbq follow=False checksum=d272977c61e0e41a3a7fd79b109970736bf3689d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:39 compute-0 sudo[160023]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:39 compute-0 sudo[160175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkfcpkczjkchgtchutaeioauefovzanh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051219.2080567-110-40590680165377/AnsiballZ_file.py'
Nov 25 06:13:39 compute-0 sudo[160175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:39 compute-0 python3.9[160177]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:39 compute-0 sudo[160175]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:40 compute-0 podman[160254]: 2025-11-25 06:13:40.076543573 +0000 UTC m=+0.053641424 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 25 06:13:40 compute-0 sudo[160350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhgfvcsnyfayxpiueqpiajwswvinptye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051219.886786-118-140078813975255/AnsiballZ_lineinfile.py'
Nov 25 06:13:40 compute-0 sudo[160350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:40 compute-0 python3.9[160352]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:40 compute-0 sudo[160350]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:40 compute-0 sudo[160502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bniclkmecofujrkocthytrbrdykqsalv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051220.4940784-127-100814393272288/AnsiballZ_systemd_service.py'
Nov 25 06:13:40 compute-0 sudo[160502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:41 compute-0 python3.9[160504]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:13:41 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 25 06:13:41 compute-0 sudo[160502]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:41 compute-0 sudo[160658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iledalljnosducdkmyokmwuozlqmbhsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051221.296004-135-175703994248017/AnsiballZ_systemd_service.py'
Nov 25 06:13:41 compute-0 sudo[160658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:41 compute-0 python3.9[160660]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:13:41 compute-0 systemd[1]: Reloading.
Nov 25 06:13:41 compute-0 systemd-sysv-generator[160687]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:13:41 compute-0 systemd-rc-local-generator[160683]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:13:41 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 25 06:13:41 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 25 06:13:41 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Nov 25 06:13:42 compute-0 systemd[1]: Started Open-iSCSI.
Nov 25 06:13:42 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 25 06:13:42 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 25 06:13:42 compute-0 sudo[160658]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:42 compute-0 sudo[160858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpzgddrpsjzjotxdwhzjsulyvhrtjwet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051222.2810357-146-244177723705481/AnsiballZ_service_facts.py'
Nov 25 06:13:42 compute-0 sudo[160858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:42 compute-0 python3.9[160860]: ansible-ansible.builtin.service_facts Invoked
Nov 25 06:13:42 compute-0 network[160877]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 06:13:42 compute-0 network[160878]: 'network-scripts' will be removed from distribution in near future.
Nov 25 06:13:42 compute-0 network[160879]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 06:13:44 compute-0 sudo[160858]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:44 compute-0 sudo[161148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixmaqnvvugnpibbdxmcalmhlgihjebgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051224.7861109-156-262734580203679/AnsiballZ_file.py'
Nov 25 06:13:44 compute-0 sudo[161148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:45 compute-0 python3.9[161150]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 25 06:13:45 compute-0 sudo[161148]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:45 compute-0 sudo[161300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhsxfuwtugriaspldmypdltrtwdbvthm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051225.247707-164-272242059247897/AnsiballZ_modprobe.py'
Nov 25 06:13:45 compute-0 sudo[161300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:45 compute-0 python3.9[161302]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 25 06:13:45 compute-0 sudo[161300]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:46 compute-0 sudo[161456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnyoxzvxlrvngnyxihzgzbtwoejgescz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051225.82526-172-212736222633573/AnsiballZ_stat.py'
Nov 25 06:13:46 compute-0 sudo[161456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:46 compute-0 python3.9[161458]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:46 compute-0 sudo[161456]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:46 compute-0 sudo[161579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzgurhtdgiiblygjtesggwhcfvygvjvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051225.82526-172-212736222633573/AnsiballZ_copy.py'
Nov 25 06:13:46 compute-0 sudo[161579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:46 compute-0 python3.9[161581]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051225.82526-172-212736222633573/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:46 compute-0 sudo[161579]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:46 compute-0 sudo[161731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugytuzkvtznnhmnvamuolmqmcuoroyon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051226.7041988-188-99586368811659/AnsiballZ_lineinfile.py'
Nov 25 06:13:46 compute-0 sudo[161731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:13:47.007 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:13:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:13:47.008 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:13:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:13:47.008 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:13:47 compute-0 python3.9[161733]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:47 compute-0 sudo[161731]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:47 compute-0 sudo[161884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mylsrtkyixmflbqnzcftvyleopnppwvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051227.1575642-196-154402478050065/AnsiballZ_systemd.py'
Nov 25 06:13:47 compute-0 sudo[161884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:47 compute-0 python3.9[161886]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:13:47 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 25 06:13:47 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 25 06:13:47 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 25 06:13:47 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 25 06:13:47 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 25 06:13:47 compute-0 sudo[161884]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:48 compute-0 sudo[162040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdptxgegcoqvuklpuhebckcbwxaskwki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051228.0253847-204-156840999570736/AnsiballZ_file.py'
Nov 25 06:13:48 compute-0 sudo[162040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:48 compute-0 python3.9[162042]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:13:48 compute-0 sudo[162040]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:48 compute-0 sudo[162192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwdkccwecotpewfqgooirjtknfoonbtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051228.543417-213-26695970206348/AnsiballZ_stat.py'
Nov 25 06:13:48 compute-0 sudo[162192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:48 compute-0 python3.9[162194]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:13:48 compute-0 sudo[162192]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:49 compute-0 sudo[162344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukekzuenmgcosrvporxgwqbnwkbovuma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051229.01772-222-70502854954427/AnsiballZ_stat.py'
Nov 25 06:13:49 compute-0 sudo[162344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:49 compute-0 python3.9[162346]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:13:49 compute-0 sudo[162344]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:49 compute-0 sudo[162496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srbzwjemkvovesqidyyyskprqlcgecno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051229.4589553-230-134928916712681/AnsiballZ_stat.py'
Nov 25 06:13:49 compute-0 sudo[162496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:49 compute-0 python3.9[162498]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:49 compute-0 sudo[162496]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:50 compute-0 sudo[162628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqsvtyasexkiowgwsmbevurhvsrarcfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051229.4589553-230-134928916712681/AnsiballZ_copy.py'
Nov 25 06:13:50 compute-0 sudo[162628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:50 compute-0 podman[162593]: 2025-11-25 06:13:50.014928418 +0000 UTC m=+0.036701985 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 06:13:50 compute-0 python3.9[162638]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051229.4589553-230-134928916712681/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:50 compute-0 sudo[162628]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:50 compute-0 sudo[162789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erymyjxyzcmsccnhgfxwzkkwxflfbymj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051230.2866259-245-273416092745373/AnsiballZ_command.py'
Nov 25 06:13:50 compute-0 sudo[162789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:50 compute-0 python3.9[162791]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:13:50 compute-0 sudo[162789]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:50 compute-0 sudo[162942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuxktabuivntlturwprhzqezltpzzijh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051230.7144988-253-131481530850670/AnsiballZ_lineinfile.py'
Nov 25 06:13:50 compute-0 sudo[162942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:51 compute-0 python3.9[162944]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:51 compute-0 sudo[162942]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:51 compute-0 sudo[163094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uegcvavoqwalcrqabnurzaugzpwpgmlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051231.1430607-261-37929504303820/AnsiballZ_replace.py'
Nov 25 06:13:51 compute-0 sudo[163094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:51 compute-0 python3.9[163096]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:51 compute-0 sudo[163094]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:51 compute-0 sudo[163246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jomreewkkxntotlzxzfrpeoudsalbiqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051231.7371817-269-266173747177122/AnsiballZ_replace.py'
Nov 25 06:13:51 compute-0 sudo[163246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:52 compute-0 python3.9[163248]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:52 compute-0 sudo[163246]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:52 compute-0 sudo[163398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjngsqsbrgwvbcamuezhkfphqxevbgno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051232.196581-278-116073562482319/AnsiballZ_lineinfile.py'
Nov 25 06:13:52 compute-0 sudo[163398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:52 compute-0 python3.9[163400]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:52 compute-0 sudo[163398]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:52 compute-0 sudo[163550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfyskskckqovlbahkumwzkhqvfrpcffi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051232.6166315-278-197799664419088/AnsiballZ_lineinfile.py'
Nov 25 06:13:52 compute-0 sudo[163550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:52 compute-0 python3.9[163552]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:52 compute-0 sudo[163550]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:53 compute-0 sudo[163702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuvzkijmthxumhtdzbmxhvkfdtibbqdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051233.030597-278-232392212719060/AnsiballZ_lineinfile.py'
Nov 25 06:13:53 compute-0 sudo[163702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:53 compute-0 python3.9[163704]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:53 compute-0 sudo[163702]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:53 compute-0 sudo[163854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjdbdzfyhmodxdvpqopcgfiolxmbfqog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051233.51641-278-266096471645468/AnsiballZ_lineinfile.py'
Nov 25 06:13:53 compute-0 sudo[163854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:53 compute-0 python3.9[163856]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:53 compute-0 sudo[163854]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:54 compute-0 sudo[164006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gldxzzjxbkgzigcsdynbjwzzvgpppuou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051233.9499385-307-252926338606216/AnsiballZ_stat.py'
Nov 25 06:13:54 compute-0 sudo[164006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:54 compute-0 python3.9[164008]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:13:54 compute-0 sudo[164006]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:54 compute-0 sudo[164160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzvigleizzjopmtfvqbxqpxvazthfztm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051234.3983934-315-95125925225382/AnsiballZ_file.py'
Nov 25 06:13:54 compute-0 sudo[164160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:54 compute-0 python3.9[164162]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:54 compute-0 sudo[164160]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:55 compute-0 sudo[164312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isnxpkkrwdbbgcxzjphoxqhrjvpvhxbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051234.895621-324-264984003906/AnsiballZ_file.py'
Nov 25 06:13:55 compute-0 sudo[164312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:55 compute-0 python3.9[164314]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:13:55 compute-0 sudo[164312]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:55 compute-0 sudo[164464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qosphixscuhyraoeqcfuloamoullfluj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051235.3426497-332-118493916615935/AnsiballZ_stat.py'
Nov 25 06:13:55 compute-0 sudo[164464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:55 compute-0 python3.9[164466]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:55 compute-0 sudo[164464]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:55 compute-0 sudo[164542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yatxswtztzxvpmlfhkzlbkcvbosdnwfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051235.3426497-332-118493916615935/AnsiballZ_file.py'
Nov 25 06:13:55 compute-0 sudo[164542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:56 compute-0 python3.9[164544]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:13:56 compute-0 sudo[164542]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:56 compute-0 sudo[164694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qafyosvxmujcrukhbmyonuyjjxnbywbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051236.1444032-332-153896347585056/AnsiballZ_stat.py'
Nov 25 06:13:56 compute-0 sudo[164694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:56 compute-0 python3.9[164696]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:56 compute-0 sudo[164694]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:56 compute-0 sudo[164772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukrbmjvdcabjasoobdlybdurphxdrdfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051236.1444032-332-153896347585056/AnsiballZ_file.py'
Nov 25 06:13:56 compute-0 sudo[164772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:56 compute-0 python3.9[164774]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:13:56 compute-0 sudo[164772]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:57 compute-0 sudo[164924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhacgstlqwpoalbahwxrbrocblnvwfkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051236.9260132-355-65516360302485/AnsiballZ_file.py'
Nov 25 06:13:57 compute-0 sudo[164924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:57 compute-0 python3.9[164926]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:57 compute-0 sudo[164924]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:57 compute-0 sudo[165076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxvhurwibsdsilioqbqssbnqgjrejcnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051237.3781133-363-148864774706641/AnsiballZ_stat.py'
Nov 25 06:13:57 compute-0 sudo[165076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:57 compute-0 python3.9[165078]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:57 compute-0 sudo[165076]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:57 compute-0 sudo[165154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddquercicvhvyljgvkgbxcgnsyrlwyjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051237.3781133-363-148864774706641/AnsiballZ_file.py'
Nov 25 06:13:57 compute-0 sudo[165154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:58 compute-0 python3.9[165156]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:58 compute-0 sudo[165154]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:58 compute-0 sudo[165306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keubdjaobyotnuyyraxklfzgdfmssoto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051238.2765934-375-168162263914841/AnsiballZ_stat.py'
Nov 25 06:13:58 compute-0 sudo[165306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:58 compute-0 python3.9[165308]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:13:58 compute-0 sudo[165306]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:58 compute-0 sudo[165384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctdzgnleqsvwokkxicbuwwcervgsigig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051238.2765934-375-168162263914841/AnsiballZ_file.py'
Nov 25 06:13:58 compute-0 sudo[165384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:58 compute-0 python3.9[165386]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:13:58 compute-0 sudo[165384]: pam_unix(sudo:session): session closed for user root
Nov 25 06:13:59 compute-0 sudo[165536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-undvhqgupngmrwefedjsecnlilmpdynv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051239.082362-387-153947821982947/AnsiballZ_systemd.py'
Nov 25 06:13:59 compute-0 sudo[165536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:13:59 compute-0 python3.9[165538]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:13:59 compute-0 systemd[1]: Reloading.
Nov 25 06:13:59 compute-0 systemd-sysv-generator[165563]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:13:59 compute-0 systemd-rc-local-generator[165559]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:13:59 compute-0 sudo[165536]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:00 compute-0 sudo[165726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bltsvrusqoltakzohpkfyuzdpbneezgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051239.8834915-395-270606673091464/AnsiballZ_stat.py'
Nov 25 06:14:00 compute-0 sudo[165726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:00 compute-0 python3.9[165728]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:14:00 compute-0 sudo[165726]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:00 compute-0 sudo[165804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttssjxumciscppcmlcnjohbctjrdqnsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051239.8834915-395-270606673091464/AnsiballZ_file.py'
Nov 25 06:14:00 compute-0 sudo[165804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:00 compute-0 python3.9[165806]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:00 compute-0 sudo[165804]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:00 compute-0 sudo[165956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teprapqufknmmzpfiehxsltdyqbuwcof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051240.6591148-407-88612133347187/AnsiballZ_stat.py'
Nov 25 06:14:00 compute-0 sudo[165956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:00 compute-0 python3.9[165958]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:14:01 compute-0 sudo[165956]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:01 compute-0 sudo[166034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkcldkhssjbmbrrxvimtnkcusbirjjrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051240.6591148-407-88612133347187/AnsiballZ_file.py'
Nov 25 06:14:01 compute-0 sudo[166034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:01 compute-0 python3.9[166036]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:01 compute-0 sudo[166034]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:01 compute-0 sudo[166186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpusdjkdfadtfrdaecnzusqdoyquksqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051241.4074361-419-274117225260380/AnsiballZ_systemd.py'
Nov 25 06:14:01 compute-0 sudo[166186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:01 compute-0 python3.9[166188]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:14:01 compute-0 systemd[1]: Reloading.
Nov 25 06:14:01 compute-0 systemd-rc-local-generator[166209]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:14:01 compute-0 systemd-sysv-generator[166212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:14:02 compute-0 systemd[1]: Starting Create netns directory...
Nov 25 06:14:02 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 25 06:14:02 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 25 06:14:02 compute-0 systemd[1]: Finished Create netns directory.
Nov 25 06:14:02 compute-0 sudo[166186]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:02 compute-0 sudo[166379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-madialcqeagftbgncpyhptpzocgelcdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051242.4268515-429-80073588113465/AnsiballZ_file.py'
Nov 25 06:14:02 compute-0 sudo[166379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:02 compute-0 python3.9[166381]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:02 compute-0 sudo[166379]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:03 compute-0 sudo[166531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jscrauphgtpzlwurhrvuxnlpllxxpchd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051242.8810287-437-228118267293509/AnsiballZ_stat.py'
Nov 25 06:14:03 compute-0 sudo[166531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:03 compute-0 python3.9[166533]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:14:03 compute-0 sudo[166531]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:03 compute-0 sudo[166654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hezhrnkvgzqysibtibtsaxfryqieyjqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051242.8810287-437-228118267293509/AnsiballZ_copy.py'
Nov 25 06:14:03 compute-0 sudo[166654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:03 compute-0 python3.9[166656]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051242.8810287-437-228118267293509/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:03 compute-0 sudo[166654]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:03 compute-0 sudo[166806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zplfjtlxftmzpvzsnxixysjeprggyyqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051243.8233702-454-59131248104385/AnsiballZ_file.py'
Nov 25 06:14:03 compute-0 sudo[166806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:04 compute-0 python3.9[166808]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:04 compute-0 sudo[166806]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:04 compute-0 sudo[166958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjlmswtdhujdvupyvkwkmwahrfrxzomi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051244.2864351-462-184036253752994/AnsiballZ_stat.py'
Nov 25 06:14:04 compute-0 sudo[166958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:04 compute-0 python3.9[166960]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:14:04 compute-0 sudo[166958]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:04 compute-0 sudo[167081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqlctabghdqjezjnuflvibqhhresesmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051244.2864351-462-184036253752994/AnsiballZ_copy.py'
Nov 25 06:14:04 compute-0 sudo[167081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:04 compute-0 python3.9[167083]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051244.2864351-462-184036253752994/.source.json _original_basename=.vkwps9n6 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:04 compute-0 sudo[167081]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:05 compute-0 sudo[167233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ophnsdruqkfstloigsadxsnenprrczym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051245.1049118-477-97735026944385/AnsiballZ_file.py'
Nov 25 06:14:05 compute-0 sudo[167233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:05 compute-0 python3.9[167235]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:05 compute-0 sudo[167233]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:05 compute-0 sudo[167385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkfzbsckbndrotnlcrnybfylyybkwtro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051245.585513-485-178926968691379/AnsiballZ_stat.py'
Nov 25 06:14:05 compute-0 sudo[167385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:05 compute-0 sudo[167385]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:06 compute-0 sudo[167508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkzdnbwvhazkwhtkdagwsejysgjfyqyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051245.585513-485-178926968691379/AnsiballZ_copy.py'
Nov 25 06:14:06 compute-0 sudo[167508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:06 compute-0 sudo[167508]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:06 compute-0 sudo[167660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nooofhkumgeyghpdlhehjhvantardkdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051246.4858432-502-252901790759245/AnsiballZ_container_config_data.py'
Nov 25 06:14:06 compute-0 sudo[167660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:06 compute-0 python3.9[167662]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 25 06:14:06 compute-0 sudo[167660]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:07 compute-0 sudo[167812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwuaxqyloinqmwranndjryjvxnnuiuxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051247.0847929-511-234954957267383/AnsiballZ_container_config_hash.py'
Nov 25 06:14:07 compute-0 sudo[167812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:07 compute-0 python3.9[167814]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 06:14:07 compute-0 sudo[167812]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:08 compute-0 sudo[167964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjtlbvayaaevresjsfjngtnxkzlfxvun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051247.739163-520-112974818556002/AnsiballZ_podman_container_info.py'
Nov 25 06:14:08 compute-0 sudo[167964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:08 compute-0 python3.9[167966]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 25 06:14:08 compute-0 sudo[167964]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:09 compute-0 sudo[168135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnubjhofziznxmcjwchcuvdcfoxnbyfm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764051248.7041295-533-242898630374951/AnsiballZ_edpm_container_manage.py'
Nov 25 06:14:09 compute-0 sudo[168135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:09 compute-0 python3[168137]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 06:14:09 compute-0 podman[168165]: 2025-11-25 06:14:09.414968943 +0000 UTC m=+0.027022735 container create 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:14:09 compute-0 podman[168165]: 2025-11-25 06:14:09.402116124 +0000 UTC m=+0.014169896 image pull 828f38556716c2bbf53d759883b37dd33dbb0b3669db0223d51c04787010a74b quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:14:09 compute-0 python3[168137]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:14:09 compute-0 sudo[168135]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:09 compute-0 sudo[168342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdmqbahxabhrnoeiyegjtnhpgfjydvzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051249.629313-541-40792285251044/AnsiballZ_stat.py'
Nov 25 06:14:09 compute-0 sudo[168342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:09 compute-0 python3.9[168344]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:14:09 compute-0 sudo[168342]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:10 compute-0 sudo[168506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsrpfcmkbyiltolwamrpyswygnylrbrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051250.1361544-550-114636849609726/AnsiballZ_file.py'
Nov 25 06:14:10 compute-0 sudo[168506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:10 compute-0 podman[168470]: 2025-11-25 06:14:10.3465067 +0000 UTC m=+0.059016261 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 25 06:14:10 compute-0 python3.9[168514]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:10 compute-0 sudo[168506]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:10 compute-0 sudo[168595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mspuewhrvufqiwtshhkxbypqprcxvhsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051250.1361544-550-114636849609726/AnsiballZ_stat.py'
Nov 25 06:14:10 compute-0 sudo[168595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:10 compute-0 python3.9[168597]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:14:10 compute-0 sudo[168595]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:11 compute-0 sudo[168746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbjnfgizntxsomjpkwyohltipkkdmwmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051250.834119-550-149238500729386/AnsiballZ_copy.py'
Nov 25 06:14:11 compute-0 sudo[168746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:11 compute-0 python3.9[168748]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764051250.834119-550-149238500729386/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:11 compute-0 sudo[168746]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:11 compute-0 sudo[168822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmnhetmbxizwmcxumpokytnuowvnkyac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051250.834119-550-149238500729386/AnsiballZ_systemd.py'
Nov 25 06:14:11 compute-0 sudo[168822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:11 compute-0 python3.9[168824]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:14:11 compute-0 systemd[1]: Reloading.
Nov 25 06:14:11 compute-0 systemd-rc-local-generator[168845]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:14:11 compute-0 systemd-sysv-generator[168849]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:14:11 compute-0 sudo[168822]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:12 compute-0 sudo[168934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjycqqsikvuokwbeoswivcuwdasdsllv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051250.834119-550-149238500729386/AnsiballZ_systemd.py'
Nov 25 06:14:12 compute-0 sudo[168934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:12 compute-0 python3.9[168936]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:14:12 compute-0 systemd[1]: Reloading.
Nov 25 06:14:12 compute-0 systemd-rc-local-generator[168958]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:14:12 compute-0 systemd-sysv-generator[168964]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:14:12 compute-0 systemd[1]: Starting multipathd container...
Nov 25 06:14:12 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:14:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e91652ef3a2c05acffda516849840198f870c9972cc7b776ba84d89e5006e756/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 25 06:14:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e91652ef3a2c05acffda516849840198f870c9972cc7b776ba84d89e5006e756/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 25 06:14:12 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd.
Nov 25 06:14:12 compute-0 podman[168976]: 2025-11-25 06:14:12.671357738 +0000 UTC m=+0.080519537 container init 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 25 06:14:12 compute-0 multipathd[168988]: + sudo -E kolla_set_configs
Nov 25 06:14:12 compute-0 podman[168976]: 2025-11-25 06:14:12.688842856 +0000 UTC m=+0.098004635 container start 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:14:12 compute-0 podman[168976]: multipathd
Nov 25 06:14:12 compute-0 sudo[168994]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 25 06:14:12 compute-0 sudo[168994]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 25 06:14:12 compute-0 sudo[168994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 25 06:14:12 compute-0 systemd[1]: Started multipathd container.
Nov 25 06:14:12 compute-0 sudo[168934]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:12 compute-0 multipathd[168988]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 06:14:12 compute-0 multipathd[168988]: INFO:__main__:Validating config file
Nov 25 06:14:12 compute-0 multipathd[168988]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 06:14:12 compute-0 multipathd[168988]: INFO:__main__:Writing out command to execute
Nov 25 06:14:12 compute-0 sudo[168994]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:12 compute-0 multipathd[168988]: ++ cat /run_command
Nov 25 06:14:12 compute-0 multipathd[168988]: + CMD='/usr/sbin/multipathd -d'
Nov 25 06:14:12 compute-0 multipathd[168988]: + ARGS=
Nov 25 06:14:12 compute-0 multipathd[168988]: + sudo kolla_copy_cacerts
Nov 25 06:14:12 compute-0 sudo[169017]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 25 06:14:12 compute-0 sudo[169017]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 25 06:14:12 compute-0 sudo[169017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 25 06:14:12 compute-0 sudo[169017]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:12 compute-0 multipathd[168988]: Running command: '/usr/sbin/multipathd -d'
Nov 25 06:14:12 compute-0 multipathd[168988]: + [[ ! -n '' ]]
Nov 25 06:14:12 compute-0 multipathd[168988]: + . kolla_extend_start
Nov 25 06:14:12 compute-0 multipathd[168988]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 25 06:14:12 compute-0 multipathd[168988]: + umask 0022
Nov 25 06:14:12 compute-0 multipathd[168988]: + exec /usr/sbin/multipathd -d
Nov 25 06:14:12 compute-0 multipathd[168988]: 2241.211594 | --------start up--------
Nov 25 06:14:12 compute-0 multipathd[168988]: 2241.211604 | read /etc/multipath.conf
Nov 25 06:14:12 compute-0 multipathd[168988]: 2241.215468 | path checkers start up
Nov 25 06:14:12 compute-0 podman[168995]: 2025-11-25 06:14:12.781954848 +0000 UTC m=+0.077120152 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:14:12 compute-0 systemd[1]: 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd-5a690f1ccd148448.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 06:14:12 compute-0 systemd[1]: 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd-5a690f1ccd148448.service: Failed with result 'exit-code'.
Nov 25 06:14:13 compute-0 python3.9[169175]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:14:13 compute-0 sudo[169327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmeeyqmbkcaourbfbzqystuokujzjhgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051253.2968931-586-92134753229115/AnsiballZ_command.py'
Nov 25 06:14:13 compute-0 sudo[169327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:13 compute-0 python3.9[169329]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:14:13 compute-0 sudo[169327]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:13 compute-0 sudo[169488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnhcqflpflysnzjamkzwccmlwrxzcuox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051253.7827187-594-135362243035191/AnsiballZ_systemd.py'
Nov 25 06:14:13 compute-0 sudo[169488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:14 compute-0 python3.9[169490]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:14:14 compute-0 systemd[1]: Stopping multipathd container...
Nov 25 06:14:14 compute-0 multipathd[168988]: 2242.713358 | exit (signal)
Nov 25 06:14:14 compute-0 multipathd[168988]: 2242.713642 | --------shut down-------
Nov 25 06:14:14 compute-0 systemd[1]: libpod-36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd.scope: Deactivated successfully.
Nov 25 06:14:14 compute-0 podman[169494]: 2025-11-25 06:14:14.298834679 +0000 UTC m=+0.053625006 container died 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 25 06:14:14 compute-0 systemd[1]: 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd-5a690f1ccd148448.timer: Deactivated successfully.
Nov 25 06:14:14 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd.
Nov 25 06:14:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd-userdata-shm.mount: Deactivated successfully.
Nov 25 06:14:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-e91652ef3a2c05acffda516849840198f870c9972cc7b776ba84d89e5006e756-merged.mount: Deactivated successfully.
Nov 25 06:14:14 compute-0 podman[169494]: 2025-11-25 06:14:14.344045721 +0000 UTC m=+0.098836057 container cleanup 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 25 06:14:14 compute-0 podman[169494]: multipathd
Nov 25 06:14:14 compute-0 podman[169517]: multipathd
Nov 25 06:14:14 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 25 06:14:14 compute-0 systemd[1]: Stopped multipathd container.
Nov 25 06:14:14 compute-0 systemd[1]: Starting multipathd container...
Nov 25 06:14:14 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:14:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e91652ef3a2c05acffda516849840198f870c9972cc7b776ba84d89e5006e756/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 25 06:14:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e91652ef3a2c05acffda516849840198f870c9972cc7b776ba84d89e5006e756/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 25 06:14:14 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd.
Nov 25 06:14:14 compute-0 podman[169526]: 2025-11-25 06:14:14.480697577 +0000 UTC m=+0.071479214 container init 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 25 06:14:14 compute-0 multipathd[169539]: + sudo -E kolla_set_configs
Nov 25 06:14:14 compute-0 sudo[169545]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 25 06:14:14 compute-0 sudo[169545]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 25 06:14:14 compute-0 sudo[169545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 25 06:14:14 compute-0 podman[169526]: 2025-11-25 06:14:14.50070716 +0000 UTC m=+0.091488787 container start 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 25 06:14:14 compute-0 podman[169526]: multipathd
Nov 25 06:14:14 compute-0 systemd[1]: Started multipathd container.
Nov 25 06:14:14 compute-0 sudo[169488]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:14 compute-0 multipathd[169539]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 06:14:14 compute-0 multipathd[169539]: INFO:__main__:Validating config file
Nov 25 06:14:14 compute-0 multipathd[169539]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 06:14:14 compute-0 multipathd[169539]: INFO:__main__:Writing out command to execute
Nov 25 06:14:14 compute-0 sudo[169545]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:14 compute-0 multipathd[169539]: ++ cat /run_command
Nov 25 06:14:14 compute-0 multipathd[169539]: + CMD='/usr/sbin/multipathd -d'
Nov 25 06:14:14 compute-0 multipathd[169539]: + ARGS=
Nov 25 06:14:14 compute-0 multipathd[169539]: + sudo kolla_copy_cacerts
Nov 25 06:14:14 compute-0 sudo[169565]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 25 06:14:14 compute-0 sudo[169565]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 25 06:14:14 compute-0 sudo[169565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 25 06:14:14 compute-0 sudo[169565]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:14 compute-0 multipathd[169539]: + [[ ! -n '' ]]
Nov 25 06:14:14 compute-0 multipathd[169539]: + . kolla_extend_start
Nov 25 06:14:14 compute-0 multipathd[169539]: Running command: '/usr/sbin/multipathd -d'
Nov 25 06:14:14 compute-0 multipathd[169539]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 25 06:14:14 compute-0 multipathd[169539]: + umask 0022
Nov 25 06:14:14 compute-0 multipathd[169539]: + exec /usr/sbin/multipathd -d
Nov 25 06:14:14 compute-0 multipathd[169539]: 2243.015448 | --------start up--------
Nov 25 06:14:14 compute-0 multipathd[169539]: 2243.015459 | read /etc/multipath.conf
Nov 25 06:14:14 compute-0 multipathd[169539]: 2243.019188 | path checkers start up
Nov 25 06:14:14 compute-0 podman[169546]: 2025-11-25 06:14:14.577959641 +0000 UTC m=+0.068128555 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true)
Nov 25 06:14:14 compute-0 systemd[1]: 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd-7389ac00e25e7bad.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 06:14:14 compute-0 systemd[1]: 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd-7389ac00e25e7bad.service: Failed with result 'exit-code'.
Nov 25 06:14:14 compute-0 sudo[169725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoenlxgyidgzofmnublnmtjqkmaephfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051254.6678808-602-263768446074151/AnsiballZ_file.py'
Nov 25 06:14:14 compute-0 sudo[169725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:15 compute-0 python3.9[169727]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:15 compute-0 sudo[169725]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:15 compute-0 sudo[169877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipellxhovkubnjreynozajdkbvjtsebs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051255.2998283-614-85983642704923/AnsiballZ_file.py'
Nov 25 06:14:15 compute-0 sudo[169877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:15 compute-0 python3.9[169879]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 25 06:14:15 compute-0 sudo[169877]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:15 compute-0 sudo[170029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thozttasqxwouyvqxkkziixcctonwuwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051255.7504199-622-17410600524173/AnsiballZ_modprobe.py'
Nov 25 06:14:15 compute-0 sudo[170029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:16 compute-0 python3.9[170031]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 25 06:14:16 compute-0 kernel: Key type psk registered
Nov 25 06:14:16 compute-0 sudo[170029]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:16 compute-0 sudo[170193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mghqaowcpnnfovlkdmngvifbnvusrbsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051256.2314186-630-172091453857824/AnsiballZ_stat.py'
Nov 25 06:14:16 compute-0 sudo[170193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:16 compute-0 python3.9[170195]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:14:16 compute-0 sudo[170193]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:16 compute-0 sudo[170316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewnbldlqtbjzdemuwwrizgktawhutjap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051256.2314186-630-172091453857824/AnsiballZ_copy.py'
Nov 25 06:14:16 compute-0 sudo[170316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:16 compute-0 python3.9[170318]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051256.2314186-630-172091453857824/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:16 compute-0 sudo[170316]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:17 compute-0 sudo[170468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asylrilevihxoafhsjttxtkjxlcrliiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051257.0754535-646-97546237692329/AnsiballZ_lineinfile.py'
Nov 25 06:14:17 compute-0 sudo[170468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:17 compute-0 python3.9[170470]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:17 compute-0 sudo[170468]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:17 compute-0 sudo[170620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwyzcknzvxycjbeltjyblvtwnhqolglj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051257.5066085-654-241155327070896/AnsiballZ_systemd.py'
Nov 25 06:14:17 compute-0 sudo[170620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:17 compute-0 python3.9[170622]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:14:17 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 25 06:14:17 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 25 06:14:17 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 25 06:14:17 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 25 06:14:17 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 25 06:14:17 compute-0 sudo[170620]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:18 compute-0 sudo[170776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjyzzabbaococyopobdsdaszvmlctvsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051258.1185877-662-2271226637535/AnsiballZ_dnf.py'
Nov 25 06:14:18 compute-0 sudo[170776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:18 compute-0 python3.9[170778]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 06:14:20 compute-0 systemd[1]: Reloading.
Nov 25 06:14:20 compute-0 podman[170785]: 2025-11-25 06:14:20.549186963 +0000 UTC m=+0.051676273 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 25 06:14:20 compute-0 systemd-sysv-generator[170822]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:14:20 compute-0 systemd-rc-local-generator[170819]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:14:20 compute-0 systemd[1]: Reloading.
Nov 25 06:14:20 compute-0 systemd-sysv-generator[170857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:14:20 compute-0 systemd-rc-local-generator[170854]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:14:20 compute-0 systemd-logind[744]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 25 06:14:21 compute-0 systemd-logind[744]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 25 06:14:21 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 06:14:21 compute-0 systemd[1]: Reloading.
Nov 25 06:14:21 compute-0 systemd-rc-local-generator[170964]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:14:21 compute-0 systemd-sysv-generator[170967]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:14:21 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 06:14:21 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 06:14:21 compute-0 sudo[170776]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:21 compute-0 sudo[171861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guisphjcpehqappmkvfihlusekucfkfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051261.8093908-670-129135503774123/AnsiballZ_systemd_service.py'
Nov 25 06:14:22 compute-0 sudo[171861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:22 compute-0 python3.9[171890]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:14:22 compute-0 iscsid[160699]: iscsid shutting down.
Nov 25 06:14:22 compute-0 systemd[1]: Stopping Open-iSCSI...
Nov 25 06:14:22 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Nov 25 06:14:22 compute-0 systemd[1]: Stopped Open-iSCSI.
Nov 25 06:14:22 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 06:14:22 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 06:14:22 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.044s CPU time.
Nov 25 06:14:22 compute-0 systemd[1]: run-r09f3b69710f9408d90cc6f44cdce20ce.service: Deactivated successfully.
Nov 25 06:14:22 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 25 06:14:22 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 25 06:14:22 compute-0 systemd[1]: Started Open-iSCSI.
Nov 25 06:14:22 compute-0 sudo[171861]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:22 compute-0 python3.9[172416]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:14:23 compute-0 sudo[172570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayvpggrrqninspjebkzirrxoiszuizxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051263.209748-688-270216396164951/AnsiballZ_file.py'
Nov 25 06:14:23 compute-0 sudo[172570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:23 compute-0 python3.9[172572]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:23 compute-0 sudo[172570]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:23 compute-0 sudo[172722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsljdeghjhytbdedkcmbfiycslljgyse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051263.7815492-699-65584546390863/AnsiballZ_systemd_service.py'
Nov 25 06:14:23 compute-0 sudo[172722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:24 compute-0 python3.9[172724]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:14:24 compute-0 systemd[1]: Reloading.
Nov 25 06:14:24 compute-0 systemd-rc-local-generator[172745]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:14:24 compute-0 systemd-sysv-generator[172748]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:14:24 compute-0 sudo[172722]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:24 compute-0 python3.9[172909]: ansible-ansible.builtin.service_facts Invoked
Nov 25 06:14:24 compute-0 network[172926]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 06:14:24 compute-0 network[172927]: 'network-scripts' will be removed from distribution in near future.
Nov 25 06:14:24 compute-0 network[172928]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 06:14:27 compute-0 sudo[173200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mywtiefpzeqivlnxedeexgsvwmxpgdkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051266.9976745-718-83047821383430/AnsiballZ_systemd_service.py'
Nov 25 06:14:27 compute-0 sudo[173200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:27 compute-0 python3.9[173202]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:14:27 compute-0 sudo[173200]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:27 compute-0 sudo[173353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycfubrgsjddlymebywilebosvvjfkxsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051267.5228777-718-28615446569054/AnsiballZ_systemd_service.py'
Nov 25 06:14:27 compute-0 sudo[173353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:27 compute-0 python3.9[173355]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:14:27 compute-0 sudo[173353]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:28 compute-0 sudo[173506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixlnlakkpvlfqukesnhqgvrjulvajxpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051268.0358143-718-163055309331479/AnsiballZ_systemd_service.py'
Nov 25 06:14:28 compute-0 sudo[173506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:28 compute-0 python3.9[173508]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:14:28 compute-0 sudo[173506]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:28 compute-0 sudo[173659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxhhxgiqikwcudzbplaetxxowtomvibj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051268.5494783-718-86730215953248/AnsiballZ_systemd_service.py'
Nov 25 06:14:28 compute-0 sudo[173659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:28 compute-0 python3.9[173661]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:14:28 compute-0 sudo[173659]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:29 compute-0 sudo[173812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjwlsyvyfzfhwmwemqxjskkpxiktkziz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051269.069109-718-241458873108584/AnsiballZ_systemd_service.py'
Nov 25 06:14:29 compute-0 sudo[173812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:29 compute-0 python3.9[173814]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:14:29 compute-0 sudo[173812]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:29 compute-0 sudo[173965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anokmsyptpjylfhviycmgwcpfdmuwgcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051269.5868654-718-1208717768828/AnsiballZ_systemd_service.py'
Nov 25 06:14:29 compute-0 sudo[173965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:29 compute-0 python3.9[173967]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:14:30 compute-0 sudo[173965]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:30 compute-0 sudo[174118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqofkbowtmkvxrwokbaotbwjmsdddmoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051270.1034265-718-31296722956835/AnsiballZ_systemd_service.py'
Nov 25 06:14:30 compute-0 sudo[174118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:30 compute-0 python3.9[174120]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:14:30 compute-0 sudo[174118]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:30 compute-0 sudo[174271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jirwskwwbcdxzxopsfrgjqtqvxebspcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051270.6315956-718-270069303398359/AnsiballZ_systemd_service.py'
Nov 25 06:14:30 compute-0 sudo[174271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:31 compute-0 python3.9[174273]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:14:31 compute-0 sudo[174271]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:31 compute-0 sudo[174424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvskfomjjpqmjnsxmhptgbfglxosivvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051271.2783942-777-214443489258875/AnsiballZ_file.py'
Nov 25 06:14:31 compute-0 sudo[174424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:31 compute-0 python3.9[174426]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:31 compute-0 sudo[174424]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:31 compute-0 sudo[174576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqsnnjqbdgrrrtfzipkylvfrxpbbzoqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051271.7047765-777-272777166251097/AnsiballZ_file.py'
Nov 25 06:14:31 compute-0 sudo[174576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:32 compute-0 python3.9[174578]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:32 compute-0 sudo[174576]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:32 compute-0 sudo[174728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svskhshsgylhsyahhzficuhubzbvylot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051272.1196923-777-250660501933731/AnsiballZ_file.py'
Nov 25 06:14:32 compute-0 sudo[174728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:32 compute-0 python3.9[174730]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:32 compute-0 sudo[174728]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:32 compute-0 sudo[174880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czhwibngfutloqkbpxjfsfkqyenqcoaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051272.545086-777-75199397413573/AnsiballZ_file.py'
Nov 25 06:14:32 compute-0 sudo[174880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:32 compute-0 python3.9[174882]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:32 compute-0 sudo[174880]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:33 compute-0 sudo[175032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjwdzeacvhkobyofkrsrbezbehguultz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051272.9612534-777-113407392864293/AnsiballZ_file.py'
Nov 25 06:14:33 compute-0 sudo[175032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:33 compute-0 python3.9[175034]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:33 compute-0 sudo[175032]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:33 compute-0 sudo[175184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwlplpuewrxsphqxluimfyzyvsccnqhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051273.3748193-777-144799075979985/AnsiballZ_file.py'
Nov 25 06:14:33 compute-0 sudo[175184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:33 compute-0 python3.9[175186]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:33 compute-0 sudo[175184]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:33 compute-0 sudo[175336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgldwxsrojevhwiqtiawbscsvchyqkag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051273.798612-777-266167308238656/AnsiballZ_file.py'
Nov 25 06:14:33 compute-0 sudo[175336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:34 compute-0 python3.9[175338]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:34 compute-0 sudo[175336]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:34 compute-0 sudo[175488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lutrxwdtgqfzhhdwlbdzozhzwwydniac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051274.2228575-777-155335705504030/AnsiballZ_file.py'
Nov 25 06:14:34 compute-0 sudo[175488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:34 compute-0 python3.9[175490]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:34 compute-0 sudo[175488]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:34 compute-0 sudo[175640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgurfpfpyhvyydxxxmedusudnncouzhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051274.6717257-834-90754379436912/AnsiballZ_file.py'
Nov 25 06:14:34 compute-0 sudo[175640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:34 compute-0 python3.9[175642]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:35 compute-0 sudo[175640]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:35 compute-0 sudo[175792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifbauelzgbklvlemrlzpmyircivnkkts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051275.0880013-834-115478498248311/AnsiballZ_file.py'
Nov 25 06:14:35 compute-0 sudo[175792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:35 compute-0 python3.9[175794]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:35 compute-0 sudo[175792]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:35 compute-0 sudo[175944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tohpvtcjglmuvzkuvawbrowlsjiyjqrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051275.5056877-834-69698063710563/AnsiballZ_file.py'
Nov 25 06:14:35 compute-0 sudo[175944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:35 compute-0 python3.9[175946]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:35 compute-0 sudo[175944]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:36 compute-0 sudo[176096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpieejzitnszskvclwpvvltqvuhxopgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051275.9282806-834-154812117954026/AnsiballZ_file.py'
Nov 25 06:14:36 compute-0 sudo[176096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:36 compute-0 python3.9[176098]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:36 compute-0 sudo[176096]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:36 compute-0 sudo[176248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjlbqyjtwsqzrljspbcvtaxininaqbob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051276.3395765-834-271525483697170/AnsiballZ_file.py'
Nov 25 06:14:36 compute-0 sudo[176248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:36 compute-0 python3.9[176250]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:36 compute-0 sudo[176248]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:36 compute-0 sudo[176400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqyuyqmbjoouptibiqcrjhfclbphippy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051276.7471964-834-17775503426049/AnsiballZ_file.py'
Nov 25 06:14:36 compute-0 sudo[176400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:37 compute-0 python3.9[176402]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:37 compute-0 sudo[176400]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:37 compute-0 sudo[176552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqpmcdbqcjfbtciehnqesrqatscoqfae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051277.1539736-834-12225546217620/AnsiballZ_file.py'
Nov 25 06:14:37 compute-0 sudo[176552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:37 compute-0 python3.9[176554]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:37 compute-0 sudo[176552]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:37 compute-0 sudo[176704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpppnhhmtbzvqtcdlspsijvprqpqrkft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051277.5598223-834-27618673961995/AnsiballZ_file.py'
Nov 25 06:14:37 compute-0 sudo[176704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:37 compute-0 python3.9[176706]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:37 compute-0 sudo[176704]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:38 compute-0 sudo[176856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-semsqnjzpsxmptfpdjxnyumjcmcsgnwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051278.0441365-892-3027255235272/AnsiballZ_command.py'
Nov 25 06:14:38 compute-0 sudo[176856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:38 compute-0 python3.9[176858]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:14:38 compute-0 sudo[176856]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:38 compute-0 python3.9[177010]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 06:14:39 compute-0 sudo[177160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onmlfgwcsyjxecnghhcmtzvuhxgzzyeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051279.1547117-910-179648293644953/AnsiballZ_systemd_service.py'
Nov 25 06:14:39 compute-0 sudo[177160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:39 compute-0 python3.9[177162]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:14:39 compute-0 systemd[1]: Reloading.
Nov 25 06:14:39 compute-0 systemd-sysv-generator[177186]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:14:39 compute-0 systemd-rc-local-generator[177183]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:14:39 compute-0 sudo[177160]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:40 compute-0 sudo[177347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjstoapekltsmgicusccibleuovprecb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051279.9530008-918-137039773820306/AnsiballZ_command.py'
Nov 25 06:14:40 compute-0 sudo[177347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:40 compute-0 python3.9[177349]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:14:40 compute-0 sudo[177347]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:40 compute-0 sudo[177509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiaftootmayzkmpduclbhiuebweyhzdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051280.3849869-918-245368800653349/AnsiballZ_command.py'
Nov 25 06:14:40 compute-0 sudo[177509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:40 compute-0 podman[177474]: 2025-11-25 06:14:40.59823723 +0000 UTC m=+0.061707950 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:14:40 compute-0 python3.9[177518]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:14:40 compute-0 sudo[177509]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:41 compute-0 sudo[177676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcsrneokrpdbbyeeyycjgxdciwfqjslb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051280.937764-918-53444508616479/AnsiballZ_command.py'
Nov 25 06:14:41 compute-0 sudo[177676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:41 compute-0 python3.9[177678]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:14:41 compute-0 sudo[177676]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:41 compute-0 sudo[177829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slhbrlykdteqrypjzygofydrwsuuwlwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051281.3624413-918-24816886736488/AnsiballZ_command.py'
Nov 25 06:14:41 compute-0 sudo[177829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:41 compute-0 python3.9[177831]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:14:41 compute-0 sudo[177829]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:41 compute-0 sudo[177982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhjfwflhgrlkvrtcoafcfkoansahsdfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051281.7887607-918-105072005504039/AnsiballZ_command.py'
Nov 25 06:14:41 compute-0 sudo[177982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:42 compute-0 python3.9[177984]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:14:42 compute-0 sudo[177982]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:42 compute-0 sudo[178135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lryputbuxqmufcjsbwbkqoeoljixmwkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051282.2252514-918-223298888814422/AnsiballZ_command.py'
Nov 25 06:14:42 compute-0 sudo[178135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:42 compute-0 python3.9[178137]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:14:42 compute-0 sudo[178135]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:42 compute-0 sudo[178288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwpzkqbzwsnpblijoamhjfkgguqbvnks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051282.654876-918-96071289232548/AnsiballZ_command.py'
Nov 25 06:14:42 compute-0 sudo[178288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:42 compute-0 python3.9[178290]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:14:43 compute-0 sudo[178288]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:43 compute-0 sudo[178441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqzjksxpedydtbmryostxiqftaakzvwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051283.092748-918-280255756337709/AnsiballZ_command.py'
Nov 25 06:14:43 compute-0 sudo[178441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:43 compute-0 python3.9[178443]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:14:43 compute-0 sudo[178441]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:44 compute-0 sudo[178594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsxibavzexgnyfdrskvnpfzhlfdfpmmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051284.2286086-997-240294560373319/AnsiballZ_file.py'
Nov 25 06:14:44 compute-0 sudo[178594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:44 compute-0 python3.9[178596]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:44 compute-0 sudo[178594]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:44 compute-0 sudo[178756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mabuwmgnolltcmyazpzmxdlncfynkgyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051284.6729636-997-54317538043252/AnsiballZ_file.py'
Nov 25 06:14:44 compute-0 sudo[178756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:44 compute-0 podman[178720]: 2025-11-25 06:14:44.869258219 +0000 UTC m=+0.045916800 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3)
Nov 25 06:14:45 compute-0 python3.9[178764]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:45 compute-0 sudo[178756]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:45 compute-0 sudo[178915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvfsagfifpnldzourxtaibrwfqczrfgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051285.1288583-997-34598888887171/AnsiballZ_file.py'
Nov 25 06:14:45 compute-0 sudo[178915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:45 compute-0 python3.9[178917]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:45 compute-0 sudo[178915]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:45 compute-0 sudo[179067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inivotjsiofzxtzybsqhmkjdypeplyyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051285.5980883-1019-264366900325673/AnsiballZ_file.py'
Nov 25 06:14:45 compute-0 sudo[179067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:45 compute-0 python3.9[179069]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:45 compute-0 sudo[179067]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:46 compute-0 sudo[179219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldeqbfmbywcmlrpoxoaelwxyqfpunzey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051286.0422268-1019-263776255963880/AnsiballZ_file.py'
Nov 25 06:14:46 compute-0 sudo[179219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:46 compute-0 python3.9[179221]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:46 compute-0 sudo[179219]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:46 compute-0 sudo[179371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvdoakmtzrqvfkeysklkplggupqudmmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051286.48244-1019-48182694994862/AnsiballZ_file.py'
Nov 25 06:14:46 compute-0 sudo[179371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:46 compute-0 python3.9[179373]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:46 compute-0 sudo[179371]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:14:47.068 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:14:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:14:47.068 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:14:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:14:47.068 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:14:47 compute-0 sudo[179523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pinhmcbmaaajpepqjtvoifgxhuksaixd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051286.9117265-1019-122300721329415/AnsiballZ_file.py'
Nov 25 06:14:47 compute-0 sudo[179523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:47 compute-0 python3.9[179526]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:47 compute-0 sudo[179523]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:47 compute-0 sudo[179676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivgmmofacztgvevabszfrpqkrvjwfctp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051287.333445-1019-190152361627906/AnsiballZ_file.py'
Nov 25 06:14:47 compute-0 sudo[179676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:47 compute-0 python3.9[179678]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:47 compute-0 sudo[179676]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:47 compute-0 sudo[179828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsbhvpzdhbcuejwdkdvfgbhbyruevvmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051287.7602122-1019-8400585866080/AnsiballZ_file.py'
Nov 25 06:14:47 compute-0 sudo[179828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:48 compute-0 python3.9[179830]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:48 compute-0 sudo[179828]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:48 compute-0 sudo[179980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxxnhkdirhemcbddzqapbkapsoxtcdpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051288.183025-1019-66397386279477/AnsiballZ_file.py'
Nov 25 06:14:48 compute-0 sudo[179980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:48 compute-0 python3.9[179982]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:48 compute-0 sudo[179980]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:51 compute-0 podman[180007]: 2025-11-25 06:14:51.055211175 +0000 UTC m=+0.034293668 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 25 06:14:52 compute-0 sudo[180149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owqggahscsjueaaeymqapvdcrorapyiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051291.7120495-1188-46735607441392/AnsiballZ_getent.py'
Nov 25 06:14:52 compute-0 sudo[180149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:52 compute-0 python3.9[180151]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 25 06:14:52 compute-0 sudo[180149]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:52 compute-0 sudo[180302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqhskisunrnjqxcoiptbddavrpmmowsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051292.3177116-1196-129621188465895/AnsiballZ_group.py'
Nov 25 06:14:52 compute-0 sudo[180302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:52 compute-0 python3.9[180304]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 06:14:52 compute-0 groupadd[180305]: group added to /etc/group: name=nova, GID=42436
Nov 25 06:14:52 compute-0 groupadd[180305]: group added to /etc/gshadow: name=nova
Nov 25 06:14:52 compute-0 groupadd[180305]: new group: name=nova, GID=42436
Nov 25 06:14:52 compute-0 sudo[180302]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:53 compute-0 sudo[180460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzddzsitegymjejuwygvwlxukqkjxuvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051292.936173-1204-170413766942108/AnsiballZ_user.py'
Nov 25 06:14:53 compute-0 sudo[180460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:53 compute-0 python3.9[180462]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 06:14:53 compute-0 useradd[180464]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Nov 25 06:14:53 compute-0 useradd[180464]: add 'nova' to group 'libvirt'
Nov 25 06:14:53 compute-0 useradd[180464]: add 'nova' to shadow group 'libvirt'
Nov 25 06:14:53 compute-0 sudo[180460]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:54 compute-0 sshd-session[180495]: Accepted publickey for zuul from 192.168.122.30 port 39672 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:14:54 compute-0 systemd-logind[744]: New session 24 of user zuul.
Nov 25 06:14:54 compute-0 systemd[1]: Started Session 24 of User zuul.
Nov 25 06:14:54 compute-0 sshd-session[180495]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:14:54 compute-0 sshd-session[180498]: Received disconnect from 192.168.122.30 port 39672:11: disconnected by user
Nov 25 06:14:54 compute-0 sshd-session[180498]: Disconnected from user zuul 192.168.122.30 port 39672
Nov 25 06:14:54 compute-0 sshd-session[180495]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:14:54 compute-0 systemd-logind[744]: Session 24 logged out. Waiting for processes to exit.
Nov 25 06:14:54 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Nov 25 06:14:54 compute-0 systemd-logind[744]: Removed session 24.
Nov 25 06:14:54 compute-0 python3.9[180648]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:14:55 compute-0 python3.9[180769]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051294.4301374-1229-128395890009371/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:55 compute-0 python3.9[180919]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:14:55 compute-0 python3.9[180995]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:56 compute-0 python3.9[181145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:14:56 compute-0 python3.9[181266]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051295.9631443-1229-256822861682868/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:57 compute-0 python3.9[181416]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:14:57 compute-0 python3.9[181537]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051296.8009787-1229-2070067240647/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:57 compute-0 python3.9[181687]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:14:58 compute-0 python3.9[181808]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051297.5840075-1229-51472922926698/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:58 compute-0 python3.9[181958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:14:58 compute-0 python3.9[182079]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051298.3283806-1229-234513860588954/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:14:59 compute-0 sudo[182229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjnwqofabkrlmeacpifrkkuxtbgvluol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051299.1326997-1312-198386999855028/AnsiballZ_file.py'
Nov 25 06:14:59 compute-0 sudo[182229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:14:59 compute-0 python3.9[182231]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:14:59 compute-0 sudo[182229]: pam_unix(sudo:session): session closed for user root
Nov 25 06:14:59 compute-0 sudo[182381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmseeddgvpozbuoodzaiekdchlsjaxwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051299.710038-1320-269650909457378/AnsiballZ_copy.py'
Nov 25 06:14:59 compute-0 sudo[182381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:00 compute-0 python3.9[182383]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:00 compute-0 sudo[182381]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:00 compute-0 sudo[182533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnchbzfmxvseeiizxspioylfrnpjrlsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051300.1509418-1328-153681819839990/AnsiballZ_stat.py'
Nov 25 06:15:00 compute-0 sudo[182533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:00 compute-0 python3.9[182535]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:15:00 compute-0 sudo[182533]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:00 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 25 06:15:00 compute-0 sudo[182686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upzfluvedogpxhujhmigdmsekrpvpexo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051300.6037493-1336-111427620683094/AnsiballZ_stat.py'
Nov 25 06:15:00 compute-0 sudo[182686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:00 compute-0 python3.9[182688]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:00 compute-0 sudo[182686]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:01 compute-0 sudo[182809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnoarfyijmnmeqvlledwupwtkdoffxxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051300.6037493-1336-111427620683094/AnsiballZ_copy.py'
Nov 25 06:15:01 compute-0 sudo[182809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:01 compute-0 python3.9[182811]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764051300.6037493-1336-111427620683094/.source _original_basename=.c_1_etzx follow=False checksum=16c080880ec4d26ac7ed60cbeb26406c955bdcd6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 25 06:15:01 compute-0 sudo[182809]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:01 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 25 06:15:01 compute-0 python3.9[182964]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:15:02 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 25 06:15:02 compute-0 python3.9[183116]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:02 compute-0 python3.9[183238]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051301.9842327-1362-171449999533736/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=bb909f10b8576514194106a8b798e7483d2b2f0c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:15:03 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 25 06:15:03 compute-0 python3.9[183388]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:03 compute-0 python3.9[183510]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051302.7916925-1377-131880483528960/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=1076182e34a72b1d6afe193fd4bc28ec9aaa1dc2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:15:03 compute-0 sudo[183660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkloaojugfuivfmlfwmqpcngsmbzhbuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051303.7038324-1394-247125285195466/AnsiballZ_container_config_data.py'
Nov 25 06:15:03 compute-0 sudo[183660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:04 compute-0 python3.9[183662]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 25 06:15:04 compute-0 sudo[183660]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:04 compute-0 sudo[183812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikhjmllxbxmtcaapiihyfiulgdjhxewz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051304.2131069-1403-129308199351036/AnsiballZ_container_config_hash.py'
Nov 25 06:15:04 compute-0 sudo[183812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:04 compute-0 python3.9[183814]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 06:15:04 compute-0 sudo[183812]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:04 compute-0 sudo[183964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igwvmeunbjznssrhmexozyvzlnlziqah ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764051304.7813575-1413-246627285393796/AnsiballZ_edpm_container_manage.py'
Nov 25 06:15:04 compute-0 sudo[183964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:05 compute-0 python3[183966]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 06:15:05 compute-0 podman[183995]: 2025-11-25 06:15:05.288454486 +0000 UTC m=+0.029030235 container create d36c68a1424bb4729bbf5263b47075d687cf1bad042eba8d90327994c494c128 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=nova_compute_init, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, container_name=nova_compute_init)
Nov 25 06:15:05 compute-0 podman[183995]: 2025-11-25 06:15:05.274603793 +0000 UTC m=+0.015179552 image pull b5a49b8af9b6d4308f9036b8ada850f2911f350781c3ddf60dd55cecb3543ff2 quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:15:05 compute-0 python3[183966]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 25 06:15:05 compute-0 sudo[183964]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:05 compute-0 sudo[184172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yihtvaktpglclvolykajlqtvyjrgtrtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051305.4956381-1421-100210217655489/AnsiballZ_stat.py'
Nov 25 06:15:05 compute-0 sudo[184172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:05 compute-0 python3.9[184174]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:15:05 compute-0 sudo[184172]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:06 compute-0 sudo[184326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adjcvirseguekznzldmmescqictkcasr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051306.1456122-1433-163633251171465/AnsiballZ_container_config_data.py'
Nov 25 06:15:06 compute-0 sudo[184326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:06 compute-0 python3.9[184328]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 25 06:15:06 compute-0 sudo[184326]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:06 compute-0 sudo[184478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzowjmksuudrhqphnkurgzycjxnbqnqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051306.6557448-1442-27576562561482/AnsiballZ_container_config_hash.py'
Nov 25 06:15:06 compute-0 sudo[184478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:06 compute-0 python3.9[184480]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 06:15:07 compute-0 sudo[184478]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:07 compute-0 sudo[184630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqhfblfqfxtqbesuupplumrqnsxhhyxj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764051307.2070007-1452-115252214907232/AnsiballZ_edpm_container_manage.py'
Nov 25 06:15:07 compute-0 sudo[184630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:07 compute-0 python3[184632]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 06:15:07 compute-0 podman[184658]: 2025-11-25 06:15:07.72029213 +0000 UTC m=+0.027142979 container create c274c47ddf58663bc1e7826348a96fae574523e972a6865adf8f26422da4f927 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=nova_compute, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 25 06:15:07 compute-0 podman[184658]: 2025-11-25 06:15:07.707470577 +0000 UTC m=+0.014321426 image pull b5a49b8af9b6d4308f9036b8ada850f2911f350781c3ddf60dd55cecb3543ff2 quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:15:07 compute-0 python3[184632]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78 kolla_start
Nov 25 06:15:07 compute-0 sudo[184630]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:08 compute-0 sudo[184835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-althgoghobtkgvophlffsrzbreceyynd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051307.928334-1460-143165267879177/AnsiballZ_stat.py'
Nov 25 06:15:08 compute-0 sudo[184835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:08 compute-0 python3.9[184837]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:15:08 compute-0 sudo[184835]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:08 compute-0 sudo[184989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwmwcuryxjdcsygdquxnriyroerenlbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051308.451064-1469-102611482569431/AnsiballZ_file.py'
Nov 25 06:15:08 compute-0 sudo[184989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:08 compute-0 python3.9[184991]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:08 compute-0 sudo[184989]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:09 compute-0 sudo[185140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsyntfhwabowcxoycajlwzudkbehjjev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051308.8348372-1469-186017120504317/AnsiballZ_copy.py'
Nov 25 06:15:09 compute-0 sudo[185140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:09 compute-0 python3.9[185142]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764051308.8348372-1469-186017120504317/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:09 compute-0 sudo[185140]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:09 compute-0 sudo[185216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlncqckjuglnaelndvnfqlwhkdxaemfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051308.8348372-1469-186017120504317/AnsiballZ_systemd.py'
Nov 25 06:15:09 compute-0 sudo[185216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:09 compute-0 python3.9[185218]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:15:09 compute-0 systemd[1]: Reloading.
Nov 25 06:15:09 compute-0 systemd-sysv-generator[185242]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:15:09 compute-0 systemd-rc-local-generator[185239]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:15:09 compute-0 sudo[185216]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:10 compute-0 sudo[185327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmsnkxkobymraljavpnrvntrmrfplqvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051308.8348372-1469-186017120504317/AnsiballZ_systemd.py'
Nov 25 06:15:10 compute-0 sudo[185327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:10 compute-0 python3.9[185329]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:15:10 compute-0 systemd[1]: Reloading.
Nov 25 06:15:10 compute-0 systemd-rc-local-generator[185355]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:15:10 compute-0 systemd-sysv-generator[185358]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:15:10 compute-0 systemd[1]: Starting nova_compute container...
Nov 25 06:15:10 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:15:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8405b89de538de82f87eb6ea7d73b12308a8edfb39e073b7548657e7c0c5c511/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8405b89de538de82f87eb6ea7d73b12308a8edfb39e073b7548657e7c0c5c511/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8405b89de538de82f87eb6ea7d73b12308a8edfb39e073b7548657e7c0c5c511/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8405b89de538de82f87eb6ea7d73b12308a8edfb39e073b7548657e7c0c5c511/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8405b89de538de82f87eb6ea7d73b12308a8edfb39e073b7548657e7c0c5c511/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:10 compute-0 podman[185369]: 2025-11-25 06:15:10.610695716 +0000 UTC m=+0.066126022 container init c274c47ddf58663bc1e7826348a96fae574523e972a6865adf8f26422da4f927 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 25 06:15:10 compute-0 podman[185369]: 2025-11-25 06:15:10.615740173 +0000 UTC m=+0.071170469 container start c274c47ddf58663bc1e7826348a96fae574523e972a6865adf8f26422da4f927 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute)
Nov 25 06:15:10 compute-0 podman[185369]: nova_compute
Nov 25 06:15:10 compute-0 nova_compute[185381]: + sudo -E kolla_set_configs
Nov 25 06:15:10 compute-0 systemd[1]: Started nova_compute container.
Nov 25 06:15:10 compute-0 sudo[185327]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Validating config file
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Copying service configuration files
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Deleting /etc/ceph
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Creating directory /etc/ceph
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Setting permission for /etc/ceph
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Writing out command to execute
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 25 06:15:10 compute-0 nova_compute[185381]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 25 06:15:10 compute-0 nova_compute[185381]: ++ cat /run_command
Nov 25 06:15:10 compute-0 nova_compute[185381]: + CMD=nova-compute
Nov 25 06:15:10 compute-0 nova_compute[185381]: + ARGS=
Nov 25 06:15:10 compute-0 nova_compute[185381]: + sudo kolla_copy_cacerts
Nov 25 06:15:10 compute-0 podman[185387]: 2025-11-25 06:15:10.701644718 +0000 UTC m=+0.063016929 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:15:10 compute-0 nova_compute[185381]: + [[ ! -n '' ]]
Nov 25 06:15:10 compute-0 nova_compute[185381]: + . kolla_extend_start
Nov 25 06:15:10 compute-0 nova_compute[185381]: Running command: 'nova-compute'
Nov 25 06:15:10 compute-0 nova_compute[185381]: + echo 'Running command: '\''nova-compute'\'''
Nov 25 06:15:10 compute-0 nova_compute[185381]: + umask 0022
Nov 25 06:15:10 compute-0 nova_compute[185381]: + exec nova-compute
Nov 25 06:15:11 compute-0 python3.9[185565]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:15:11 compute-0 python3.9[185715]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:15:12 compute-0 nova_compute[185381]: 2025-11-25 06:15:12.321 185385 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 25 06:15:12 compute-0 nova_compute[185381]: 2025-11-25 06:15:12.321 185385 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 25 06:15:12 compute-0 nova_compute[185381]: 2025-11-25 06:15:12.321 185385 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 25 06:15:12 compute-0 nova_compute[185381]: 2025-11-25 06:15:12.322 185385 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 25 06:15:12 compute-0 nova_compute[185381]: 2025-11-25 06:15:12.359 185385 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:15:12 compute-0 nova_compute[185381]: 2025-11-25 06:15:12.368 185385 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:15:12 compute-0 nova_compute[185381]: 2025-11-25 06:15:12.369 185385 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:423
Nov 25 06:15:12 compute-0 nova_compute[185381]: 2025-11-25 06:15:12.399 185385 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Nov 25 06:15:12 compute-0 nova_compute[185381]: 2025-11-25 06:15:12.401 185385 WARNING oslo_config.cfg [None req-d26622c1-8bb4-4a58-921c-08447230024e - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Nov 25 06:15:12 compute-0 python3.9[185867]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:15:13 compute-0 sudo[186019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxktuzynrquezwhckrodjhkedmzxkaaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051312.6363132-1529-51128778938897/AnsiballZ_podman_container.py'
Nov 25 06:15:13 compute-0 sudo[186019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:13 compute-0 python3.9[186021]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 25 06:15:13 compute-0 sudo[186019]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:13 compute-0 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 06:15:13 compute-0 nova_compute[185381]: 2025-11-25 06:15:13.398 185385 INFO nova.virt.driver [None req-d26622c1-8bb4-4a58-921c-08447230024e - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 25 06:15:13 compute-0 nova_compute[185381]: 2025-11-25 06:15:13.490 185385 INFO nova.compute.provider_config [None req-d26622c1-8bb4-4a58-921c-08447230024e - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 25 06:15:13 compute-0 sudo[186192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbbfrwtetbaizrufykvwlkguqjchjylb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051313.4707563-1537-159560102954022/AnsiballZ_systemd.py'
Nov 25 06:15:13 compute-0 sudo[186192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:13 compute-0 python3.9[186194]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:15:13 compute-0 systemd[1]: Stopping nova_compute container...
Nov 25 06:15:13 compute-0 systemd[1]: libpod-c274c47ddf58663bc1e7826348a96fae574523e972a6865adf8f26422da4f927.scope: Deactivated successfully.
Nov 25 06:15:13 compute-0 systemd[1]: libpod-c274c47ddf58663bc1e7826348a96fae574523e972a6865adf8f26422da4f927.scope: Consumed 1.915s CPU time.
Nov 25 06:15:13 compute-0 conmon[185381]: conmon c274c47ddf58663bc1e7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c274c47ddf58663bc1e7826348a96fae574523e972a6865adf8f26422da4f927.scope/container/memory.events
Nov 25 06:15:13 compute-0 podman[186198]: 2025-11-25 06:15:13.978617612 +0000 UTC m=+0.038556555 container died c274c47ddf58663bc1e7826348a96fae574523e972a6865adf8f26422da4f927 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=nova_compute, io.buildah.version=1.41.3, tcib_managed=true, container_name=nova_compute, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 25 06:15:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c274c47ddf58663bc1e7826348a96fae574523e972a6865adf8f26422da4f927-userdata-shm.mount: Deactivated successfully.
Nov 25 06:15:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-8405b89de538de82f87eb6ea7d73b12308a8edfb39e073b7548657e7c0c5c511-merged.mount: Deactivated successfully.
Nov 25 06:15:14 compute-0 podman[186198]: 2025-11-25 06:15:14.009592149 +0000 UTC m=+0.069531092 container cleanup c274c47ddf58663bc1e7826348a96fae574523e972a6865adf8f26422da4f927 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=nova_compute, org.label-schema.schema-version=1.0, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:15:14 compute-0 podman[186198]: nova_compute
Nov 25 06:15:14 compute-0 podman[186220]: nova_compute
Nov 25 06:15:14 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 25 06:15:14 compute-0 systemd[1]: Stopped nova_compute container.
Nov 25 06:15:14 compute-0 systemd[1]: Starting nova_compute container...
Nov 25 06:15:14 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:15:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8405b89de538de82f87eb6ea7d73b12308a8edfb39e073b7548657e7c0c5c511/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8405b89de538de82f87eb6ea7d73b12308a8edfb39e073b7548657e7c0c5c511/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8405b89de538de82f87eb6ea7d73b12308a8edfb39e073b7548657e7c0c5c511/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8405b89de538de82f87eb6ea7d73b12308a8edfb39e073b7548657e7c0c5c511/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8405b89de538de82f87eb6ea7d73b12308a8edfb39e073b7548657e7c0c5c511/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:14 compute-0 podman[186229]: 2025-11-25 06:15:14.152343893 +0000 UTC m=+0.076762974 container init c274c47ddf58663bc1e7826348a96fae574523e972a6865adf8f26422da4f927 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 06:15:14 compute-0 podman[186229]: 2025-11-25 06:15:14.156803694 +0000 UTC m=+0.081222764 container start c274c47ddf58663bc1e7826348a96fae574523e972a6865adf8f26422da4f927 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true)
Nov 25 06:15:14 compute-0 podman[186229]: nova_compute
Nov 25 06:15:14 compute-0 nova_compute[186241]: + sudo -E kolla_set_configs
Nov 25 06:15:14 compute-0 systemd[1]: Started nova_compute container.
Nov 25 06:15:14 compute-0 sudo[186192]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Validating config file
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Copying service configuration files
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Deleting /etc/ceph
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Creating directory /etc/ceph
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /etc/ceph
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Writing out command to execute
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 25 06:15:14 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 25 06:15:14 compute-0 nova_compute[186241]: ++ cat /run_command
Nov 25 06:15:14 compute-0 nova_compute[186241]: + CMD=nova-compute
Nov 25 06:15:14 compute-0 nova_compute[186241]: + ARGS=
Nov 25 06:15:14 compute-0 nova_compute[186241]: + sudo kolla_copy_cacerts
Nov 25 06:15:14 compute-0 nova_compute[186241]: + [[ ! -n '' ]]
Nov 25 06:15:14 compute-0 nova_compute[186241]: + . kolla_extend_start
Nov 25 06:15:14 compute-0 nova_compute[186241]: + echo 'Running command: '\''nova-compute'\'''
Nov 25 06:15:14 compute-0 nova_compute[186241]: Running command: 'nova-compute'
Nov 25 06:15:14 compute-0 nova_compute[186241]: + umask 0022
Nov 25 06:15:14 compute-0 nova_compute[186241]: + exec nova-compute
Nov 25 06:15:14 compute-0 sudo[186402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqdntlgsmxjpraucltztustiyjubwsyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051314.3427522-1546-253597634268617/AnsiballZ_podman_container.py'
Nov 25 06:15:14 compute-0 sudo[186402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:14 compute-0 python3.9[186404]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 25 06:15:14 compute-0 systemd[1]: Started libpod-conmon-d36c68a1424bb4729bbf5263b47075d687cf1bad042eba8d90327994c494c128.scope.
Nov 25 06:15:14 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:15:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5359a4fcac41d7f023e1269a3459cdf8413bbbb20533b51c4624a58b60057b80/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5359a4fcac41d7f023e1269a3459cdf8413bbbb20533b51c4624a58b60057b80/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5359a4fcac41d7f023e1269a3459cdf8413bbbb20533b51c4624a58b60057b80/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:14 compute-0 podman[186423]: 2025-11-25 06:15:14.903573287 +0000 UTC m=+0.092254977 container init d36c68a1424bb4729bbf5263b47075d687cf1bad042eba8d90327994c494c128 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=nova_compute_init, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Nov 25 06:15:14 compute-0 podman[186423]: 2025-11-25 06:15:14.909166464 +0000 UTC m=+0.097848144 container start d36c68a1424bb4729bbf5263b47075d687cf1bad042eba8d90327994c494c128 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=nova_compute_init, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3)
Nov 25 06:15:14 compute-0 python3.9[186404]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Applying nova statedir ownership
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 25 06:15:14 compute-0 nova_compute_init[186450]: INFO:nova_statedir:Nova statedir ownership complete
Nov 25 06:15:14 compute-0 systemd[1]: libpod-d36c68a1424bb4729bbf5263b47075d687cf1bad042eba8d90327994c494c128.scope: Deactivated successfully.
Nov 25 06:15:14 compute-0 podman[186438]: 2025-11-25 06:15:14.982177911 +0000 UTC m=+0.093233392 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 25 06:15:15 compute-0 podman[186468]: 2025-11-25 06:15:15.003786212 +0000 UTC m=+0.030192206 container died d36c68a1424bb4729bbf5263b47075d687cf1bad042eba8d90327994c494c128 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=nova_compute_init, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 06:15:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d36c68a1424bb4729bbf5263b47075d687cf1bad042eba8d90327994c494c128-userdata-shm.mount: Deactivated successfully.
Nov 25 06:15:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-5359a4fcac41d7f023e1269a3459cdf8413bbbb20533b51c4624a58b60057b80-merged.mount: Deactivated successfully.
Nov 25 06:15:15 compute-0 podman[186468]: 2025-11-25 06:15:15.02376686 +0000 UTC m=+0.050172834 container cleanup d36c68a1424bb4729bbf5263b47075d687cf1bad042eba8d90327994c494c128 (image=quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=nova_compute_init, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-nova-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 06:15:15 compute-0 sudo[186402]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:15 compute-0 systemd[1]: libpod-conmon-d36c68a1424bb4729bbf5263b47075d687cf1bad042eba8d90327994c494c128.scope: Deactivated successfully.
Nov 25 06:15:15 compute-0 sshd-session[158461]: Connection closed by 192.168.122.30 port 53262
Nov 25 06:15:15 compute-0 sshd-session[158458]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:15:15 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Nov 25 06:15:15 compute-0 systemd[1]: session-23.scope: Consumed 1min 17.165s CPU time.
Nov 25 06:15:15 compute-0 systemd-logind[744]: Session 23 logged out. Waiting for processes to exit.
Nov 25 06:15:15 compute-0 systemd-logind[744]: Removed session 23.
Nov 25 06:15:15 compute-0 nova_compute[186241]: 2025-11-25 06:15:15.851 186245 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 25 06:15:15 compute-0 nova_compute[186241]: 2025-11-25 06:15:15.851 186245 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 25 06:15:15 compute-0 nova_compute[186241]: 2025-11-25 06:15:15.852 186245 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 25 06:15:15 compute-0 nova_compute[186241]: 2025-11-25 06:15:15.852 186245 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 25 06:15:15 compute-0 nova_compute[186241]: 2025-11-25 06:15:15.889 186245 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:15:15 compute-0 nova_compute[186241]: 2025-11-25 06:15:15.899 186245 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:15:15 compute-0 nova_compute[186241]: 2025-11-25 06:15:15.899 186245 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:423
Nov 25 06:15:15 compute-0 nova_compute[186241]: 2025-11-25 06:15:15.931 186245 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Nov 25 06:15:15 compute-0 nova_compute[186241]: 2025-11-25 06:15:15.933 186245 WARNING oslo_config.cfg [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Nov 25 06:15:16 compute-0 nova_compute[186241]: 2025-11-25 06:15:16.886 186245 INFO nova.virt.driver [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 25 06:15:16 compute-0 nova_compute[186241]: 2025-11-25 06:15:16.977 186245 INFO nova.compute.provider_config [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.482 186245 DEBUG oslo_concurrency.lockutils [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.483 186245 DEBUG oslo_concurrency.lockutils [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.483 186245 DEBUG oslo_concurrency.lockutils [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.484 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/service.py:357
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.484 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2804
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.484 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2805
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.484 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2806
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.484 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2807
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.485 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2809
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.485 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.485 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.485 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.486 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.486 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.486 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.486 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.487 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.487 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.487 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.487 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.487 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.488 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.488 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.488 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.488 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.489 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.489 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.489 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.489 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.490 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.490 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.490 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.490 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.491 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.491 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.491 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.491 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.492 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.492 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.492 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.492 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.493 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.493 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.493 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.493 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.494 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.494 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.494 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.494 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.495 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.495 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.495 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.495 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.496 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.496 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.496 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.496 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.497 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.497 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.497 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.497 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.498 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.498 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.498 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.498 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.498 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.499 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.499 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.499 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.499 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.500 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.500 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.500 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.500 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.500 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.501 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.501 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.501 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.501 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.502 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.502 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.502 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.502 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.503 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.503 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.503 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.503 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.503 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.504 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.504 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.504 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.504 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.505 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.505 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.505 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.505 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.506 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.506 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.506 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.506 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.507 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.507 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.507 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.507 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.508 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.508 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.508 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.508 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.509 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.509 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.509 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.509 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.510 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.510 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.510 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.510 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.510 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.511 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.511 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.511 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.511 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.512 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.512 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.512 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.512 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.512 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.513 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.513 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.513 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.513 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.514 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.514 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.514 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.514 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.514 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.515 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.515 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.515 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.515 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.516 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.516 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.516 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.516 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.517 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.517 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.517 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.517 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.518 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.518 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.518 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.518 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.518 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.519 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.519 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.519 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.519 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.520 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.520 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.520 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.520 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.521 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.521 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.521 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.521 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.521 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.522 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.522 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.522 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.522 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.523 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.523 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.523 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.523 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.524 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.524 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.524 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.524 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.525 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.525 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.525 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.525 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.526 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.526 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.526 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.526 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.527 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.527 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.527 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.527 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.527 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.528 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.528 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.528 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.528 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.529 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.529 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.529 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.529 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.530 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.530 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.530 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.530 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.531 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.531 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.531 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.531 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.531 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.532 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.532 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.532 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.532 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.533 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.533 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.533 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.533 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.534 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.534 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.534 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.534 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.534 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.535 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.535 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.535 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.535 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.536 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.536 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.536 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.536 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.537 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.537 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.537 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.537 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.538 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.538 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.538 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.538 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.538 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.539 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.539 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.539 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.539 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.540 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.540 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.540 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.540 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.541 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.541 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.541 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.541 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.541 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.542 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.542 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.542 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.542 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.543 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.543 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.543 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.543 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.544 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.544 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.544 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.544 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.544 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.545 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.545 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.545 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.545 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.546 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.546 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.546 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.546 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.547 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.547 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.547 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.547 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.548 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.548 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.548 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.548 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.548 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.549 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.549 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.549 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.549 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.550 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.550 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.550 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.550 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.551 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.551 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.551 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.551 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.551 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.552 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.552 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.552 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.552 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.553 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.553 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.553 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.553 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.554 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.554 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.554 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.554 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.555 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.555 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.555 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.555 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.555 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.556 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.556 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.556 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.556 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.557 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.557 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.557 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.557 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.558 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.558 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.558 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.558 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.558 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.559 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.559 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.559 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.559 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.560 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.560 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.560 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.560 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.561 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.561 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.561 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.561 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.561 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.562 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.562 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.562 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.562 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.563 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.563 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.563 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.563 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.564 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.564 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.564 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.564 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.565 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.565 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.565 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.565 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.565 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.566 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.566 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.566 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.566 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.567 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.567 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.567 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.567 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.568 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.568 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.568 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.568 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.568 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.569 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.569 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.569 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.569 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.570 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.570 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.570 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.570 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.571 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.571 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.571 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.571 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.571 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.572 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.572 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.572 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.572 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.573 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.573 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.573 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.574 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.574 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.574 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.574 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.575 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.575 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.575 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.575 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.575 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.576 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.576 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.576 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.576 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.577 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.577 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.577 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.577 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.578 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.578 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.578 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.578 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.578 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.579 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.579 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.579 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.579 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.580 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.580 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.580 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.580 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.581 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.581 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.581 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.581 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.581 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.582 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.582 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.582 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.582 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.583 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.583 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.583 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.583 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.584 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.584 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.584 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.584 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.585 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.585 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.585 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.585 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.585 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.586 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.586 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.586 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.586 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.587 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.587 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.587 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.587 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.588 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.588 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.588 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.588 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.589 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.589 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.589 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.589 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.589 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.590 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.590 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.590 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.590 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.591 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.591 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.591 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.591 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.592 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.592 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.592 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.592 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.592 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.593 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.593 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.593 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.593 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.594 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.594 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.594 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.594 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.595 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.595 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.595 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.595 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.596 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.596 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.596 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.596 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.597 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.597 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.597 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.597 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.598 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.598 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.598 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.598 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.598 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.599 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.599 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.599 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.599 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.600 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.600 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.600 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.601 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.601 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.601 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.601 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.602 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.602 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.602 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.602 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.603 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.603 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.603 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.603 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.604 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.604 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.604 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.604 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.604 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.605 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.605 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.605 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.605 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.606 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.606 186245 WARNING oslo_config.cfg [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 25 06:15:17 compute-0 nova_compute[186241]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 25 06:15:17 compute-0 nova_compute[186241]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 25 06:15:17 compute-0 nova_compute[186241]: and ``live_migration_inbound_addr`` respectively.
Nov 25 06:15:17 compute-0 nova_compute[186241]: ).  Its value may be silently ignored in the future.
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.606 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.607 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.607 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.607 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.608 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.608 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.608 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.609 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.609 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.609 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.609 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.609 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.610 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.610 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.610 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.610 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.611 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.611 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.611 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.611 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.612 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.612 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.612 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.612 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.613 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.613 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.613 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.613 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.614 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.614 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.614 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.614 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.615 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.615 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.615 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.615 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.616 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.616 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.616 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.616 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.617 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.617 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.617 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.617 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.618 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.618 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.618 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.618 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.619 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.619 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.619 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.619 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.620 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.620 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.620 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.620 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.621 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.621 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.621 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.621 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.621 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.622 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.622 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.622 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.622 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.623 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.623 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.623 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.623 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.624 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.624 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.624 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.624 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.624 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.625 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.625 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.625 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.625 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.626 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.626 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.626 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.626 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.627 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.627 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.627 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.627 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.628 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.628 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.628 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.628 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.628 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.629 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.629 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.629 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.629 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.630 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.630 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.630 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.630 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.631 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.631 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.631 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.631 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.632 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.632 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.632 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.632 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.632 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.633 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.633 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.633 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.633 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.634 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.634 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.634 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.634 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.635 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.635 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.635 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.635 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.635 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.636 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.636 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.636 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.636 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.637 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.637 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.637 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.637 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.638 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.638 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.638 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.638 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.639 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.639 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.639 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.639 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.639 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.640 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.640 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.640 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.640 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.641 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.641 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.641 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.641 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.642 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.642 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.642 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.642 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.643 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.643 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.643 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.643 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.643 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.644 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.644 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.644 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.644 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.645 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.645 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.645 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.645 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.646 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.646 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.646 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.646 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.647 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.647 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.647 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.647 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.647 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.648 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.648 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.648 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.648 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.649 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.649 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.649 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.649 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.649 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.650 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.650 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.650 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.650 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.651 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.651 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.651 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.651 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.652 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.652 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.652 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.652 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.653 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.653 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.653 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.653 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.654 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.654 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.654 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.654 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.655 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.655 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.655 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.655 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.655 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.656 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.656 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.656 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.657 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.657 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.657 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.657 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.657 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.658 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.658 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.658 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.658 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.659 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.659 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.659 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.659 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.660 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.660 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.660 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.660 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.661 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.661 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.661 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.661 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.661 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.662 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.662 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.662 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.662 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.663 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.663 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.663 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.663 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.664 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.664 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.664 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.664 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.664 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.665 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.665 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.665 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.665 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.666 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.666 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.666 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.666 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.667 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.667 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.667 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.667 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.668 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.668 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.668 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.668 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.669 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.669 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.669 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.669 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.670 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.670 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.670 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.670 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.671 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.671 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.671 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.671 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.672 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.672 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.672 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.672 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.672 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.673 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.673 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.673 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.673 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.674 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.674 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.674 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.674 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.675 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.675 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.675 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.675 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.676 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.676 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.676 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.676 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.676 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.677 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.677 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.677 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.677 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.678 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.678 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.678 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.678 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.679 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.679 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.679 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.679 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.680 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.680 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.680 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.680 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.681 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.681 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.681 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.681 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.681 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.682 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.682 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.682 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.682 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.683 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.683 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.683 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.683 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.684 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.684 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.684 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.684 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.684 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.685 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.685 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.685 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.685 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.686 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.686 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.686 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.686 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.687 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.687 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.687 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.687 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.688 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.688 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.688 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.688 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.688 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.689 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.689 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.689 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.689 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.690 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.690 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.690 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.690 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.691 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.691 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.691 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.691 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.691 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.692 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.692 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.692 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.692 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.693 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.693 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.693 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.693 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.694 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.694 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.694 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.694 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.694 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.695 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.695 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.695 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.695 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.696 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.696 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.696 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.696 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.696 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.697 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.697 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.697 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.697 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.698 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.698 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.698 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.698 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.699 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.699 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.699 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.699 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.700 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.700 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.700 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.700 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.700 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.701 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.701 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.701 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.701 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.702 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.702 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.702 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.702 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.703 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.703 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.703 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.703 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.704 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.704 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.704 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.704 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.705 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] privsep_osbrick.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.705 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.705 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.705 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.705 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.706 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.706 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] nova_sys_admin.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.706 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.706 186245 DEBUG oslo_service.backend.eventlet.service [None req-6fcd4e0d-ccad-4e23-a979-c624668e7e2b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2828
Nov 25 06:15:17 compute-0 nova_compute[186241]: 2025-11-25 06:15:17.707 186245 INFO nova.service [-] Starting compute node (version 31.1.0-0.20250428102727.3e7017e.el9)
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.212 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:495
Nov 25 06:15:18 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 25 06:15:18 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.260 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f7faeec7280> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:501
Nov 25 06:15:18 compute-0 nova_compute[186241]: libvirt:  error : internal error: could not initialize domain event timer
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.261 186245 WARNING nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.261 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f7faeec7280> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:522
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.263 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:481
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.263 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:487
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.264 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:490
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.264 186245 INFO nova.virt.libvirt.driver [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Connection event '1' reason 'None'
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.768 186245 WARNING nova.virt.libvirt.driver [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.768 186245 DEBUG nova.virt.libvirt.volume.mount [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.929 186245 INFO nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Libvirt host capabilities <capabilities>
Nov 25 06:15:18 compute-0 nova_compute[186241]: 
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <host>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <uuid>372a8332-b89e-4c47-aaaf-012dba1b43d0</uuid>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <cpu>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <arch>x86_64</arch>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model>EPYC-Milan-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <vendor>AMD</vendor>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <microcode version='167776725'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <signature family='25' model='1' stepping='1'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <topology sockets='4' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <maxphysaddr mode='emulate' bits='48'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='x2apic'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='tsc-deadline'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='osxsave'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='hypervisor'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='tsc_adjust'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='ospke'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='vaes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='vpclmulqdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='spec-ctrl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='stibp'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='arch-capabilities'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='ssbd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='cmp_legacy'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='virt-ssbd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='lbrv'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='tsc-scale'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='vmcb-clean'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='pause-filter'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='pfthreshold'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='v-vmsave-vmload'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='vgif'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='rdctl-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='skip-l1dfl-vmentry'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='mds-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature name='pschange-mc-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <pages unit='KiB' size='4'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <pages unit='KiB' size='2048'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <pages unit='KiB' size='1048576'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </cpu>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <power_management>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <suspend_mem/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <suspend_disk/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <suspend_hybrid/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </power_management>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <iommu support='no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <migration_features>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <live/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <uri_transports>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <uri_transport>tcp</uri_transport>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <uri_transport>rdma</uri_transport>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </uri_transports>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </migration_features>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <topology>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <cells num='1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <cell id='0'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:           <memory unit='KiB'>7865372</memory>
Nov 25 06:15:18 compute-0 nova_compute[186241]:           <pages unit='KiB' size='4'>1966343</pages>
Nov 25 06:15:18 compute-0 nova_compute[186241]:           <pages unit='KiB' size='2048'>0</pages>
Nov 25 06:15:18 compute-0 nova_compute[186241]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 25 06:15:18 compute-0 nova_compute[186241]:           <distances>
Nov 25 06:15:18 compute-0 nova_compute[186241]:             <sibling id='0' value='10'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:           </distances>
Nov 25 06:15:18 compute-0 nova_compute[186241]:           <cpus num='4'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:           </cpus>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         </cell>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </cells>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </topology>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <cache>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </cache>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <secmodel>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model>selinux</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <doi>0</doi>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </secmodel>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <secmodel>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model>dac</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <doi>0</doi>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </secmodel>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   </host>
Nov 25 06:15:18 compute-0 nova_compute[186241]: 
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <guest>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <os_type>hvm</os_type>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <arch name='i686'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <wordsize>32</wordsize>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <domain type='qemu'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <domain type='kvm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </arch>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <features>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <pae/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <nonpae/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <acpi default='on' toggle='yes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <apic default='on' toggle='no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <cpuselection/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <deviceboot/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <disksnapshot default='on' toggle='no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <externalSnapshot/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </features>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   </guest>
Nov 25 06:15:18 compute-0 nova_compute[186241]: 
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <guest>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <os_type>hvm</os_type>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <arch name='x86_64'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <wordsize>64</wordsize>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <domain type='qemu'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <domain type='kvm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </arch>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <features>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <acpi default='on' toggle='yes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <apic default='on' toggle='no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <cpuselection/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <deviceboot/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <disksnapshot default='on' toggle='no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <externalSnapshot/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </features>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   </guest>
Nov 25 06:15:18 compute-0 nova_compute[186241]: 
Nov 25 06:15:18 compute-0 nova_compute[186241]: </capabilities>
Nov 25 06:15:18 compute-0 nova_compute[186241]: 
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.933 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:941
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.946 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 25 06:15:18 compute-0 nova_compute[186241]: <domainCapabilities>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <path>/usr/libexec/qemu-kvm</path>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <domain>kvm</domain>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <arch>i686</arch>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <vcpu max='240'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <iothreads supported='yes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <os supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <enum name='firmware'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <loader supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>rom</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>pflash</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='readonly'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>yes</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>no</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='secure'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>no</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </loader>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   </os>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <cpu>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <mode name='host-passthrough' supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='hostPassthroughMigratable'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>on</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>off</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <mode name='maximum' supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='maximumMigratable'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>on</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>off</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <mode name='host-model' supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model fallback='forbid'>EPYC-Milan</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <vendor>AMD</vendor>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <maxphysaddr mode='passthrough' limit='48'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='x2apic'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-deadline'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='hypervisor'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc_adjust'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='vaes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='vpclmulqdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='spec-ctrl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='stibp'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='ssbd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='cmp_legacy'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='overflow-recov'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='succor'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='virt-ssbd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='lbrv'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-scale'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='vmcb-clean'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='flushbyasid'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='pause-filter'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='pfthreshold'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='v-vmsave-vmload'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='vgif'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <mode name='custom' supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Broadwell'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Broadwell-IBRS'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v3'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v3'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v4'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v5'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cooperlake'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Denverton'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Denverton-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx10'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx10-128'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx10-256'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx10-512'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Haswell'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Haswell-IBRS'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Haswell-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Haswell-v3'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-noTSX'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v3'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v4'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v5'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v6'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v7'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='KnightsMill'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='KnightsMill-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v3'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='SierraForest'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='SierraForest-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-IBRS'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-IBRS'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v3'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v4'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v5'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Snowridge'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v3'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v4'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='athlon'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='athlon-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='core2duo'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='core2duo-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='coreduo'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='coreduo-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='n270'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='n270-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='phenom'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='phenom-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <memoryBacking supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <enum name='sourceType'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <value>file</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <value>anonymous</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <value>memfd</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   </memoryBacking>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <disk supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='diskDevice'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>disk</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>cdrom</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>floppy</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>lun</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>ide</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>fdc</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>sata</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <graphics supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>vnc</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>egl-headless</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </graphics>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <video supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='modelType'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>vga</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>cirrus</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>none</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>bochs</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>ramfb</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </video>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <hostdev supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='mode'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>subsystem</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='startupPolicy'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>default</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>mandatory</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>requisite</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>optional</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='subsysType'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>pci</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='capsType'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='pciBackend'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </hostdev>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <rng supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>random</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>egd</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <filesystem supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='driverType'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>path</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>handle</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>virtiofs</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </filesystem>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <tpm supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>tpm-tis</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>tpm-crb</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>emulator</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>external</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='backendVersion'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>2.0</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </tpm>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <redirdev supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </redirdev>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <channel supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </channel>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <crypto supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='model'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>qemu</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </crypto>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <interface supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='backendType'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>default</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>passt</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <panic supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>isa</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>hyperv</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </panic>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <console supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>null</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>vc</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>dev</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>file</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>pipe</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>stdio</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>udp</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>tcp</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>qemu-vdagent</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </console>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <features>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <gic supported='no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <vmcoreinfo supported='yes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <genid supported='yes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <backingStoreInput supported='yes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <backup supported='yes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <async-teardown supported='yes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <ps2 supported='yes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <sev supported='no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <sgx supported='no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <hyperv supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='features'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>relaxed</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>vapic</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>spinlocks</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>vpindex</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>runtime</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>synic</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>stimer</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>reset</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>vendor_id</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>frequencies</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>reenlightenment</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>tlbflush</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>ipi</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>avic</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>emsr_bitmap</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>xmm_input</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <defaults>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <spinlocks>4095</spinlocks>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <stimer_direct>on</stimer_direct>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <tlbflush_direct>on</tlbflush_direct>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <tlbflush_extended>on</tlbflush_extended>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </defaults>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </hyperv>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <launchSecurity supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='sectype'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>tdx</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </launchSecurity>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   </features>
Nov 25 06:15:18 compute-0 nova_compute[186241]: </domainCapabilities>
Nov 25 06:15:18 compute-0 nova_compute[186241]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1026
Nov 25 06:15:18 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.951 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 25 06:15:18 compute-0 nova_compute[186241]: <domainCapabilities>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <path>/usr/libexec/qemu-kvm</path>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <domain>kvm</domain>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <arch>i686</arch>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <vcpu max='4096'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <iothreads supported='yes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <os supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <enum name='firmware'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <loader supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>rom</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>pflash</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='readonly'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>yes</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>no</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='secure'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>no</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </loader>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   </os>
Nov 25 06:15:18 compute-0 nova_compute[186241]:   <cpu>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <mode name='host-passthrough' supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='hostPassthroughMigratable'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>on</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>off</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <mode name='maximum' supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <enum name='maximumMigratable'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>on</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <value>off</value>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <mode name='host-model' supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model fallback='forbid'>EPYC-Milan</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <vendor>AMD</vendor>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <maxphysaddr mode='passthrough' limit='48'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='x2apic'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-deadline'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='hypervisor'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc_adjust'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='vaes'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='vpclmulqdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='spec-ctrl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='stibp'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='ssbd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='cmp_legacy'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='overflow-recov'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='succor'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='virt-ssbd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='lbrv'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-scale'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='vmcb-clean'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='flushbyasid'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='pause-filter'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='pfthreshold'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='v-vmsave-vmload'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='vgif'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:18 compute-0 nova_compute[186241]:     <mode name='custom' supported='yes'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Broadwell'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Broadwell-IBRS'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v3'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v3'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v4'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v5'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cooperlake'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Denverton'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Denverton-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx10'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx10-128'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx10-256'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx10-512'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Haswell'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Haswell-IBRS'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Haswell-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Haswell-v3'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-noTSX'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v3'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v4'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v5'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v6'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v7'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='KnightsMill'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='KnightsMill-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v3'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='SierraForest'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='SierraForest-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-IBRS'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v1'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v2'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-IBRS'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 06:15:18 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:18 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v4'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v5'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v4'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='athlon'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='athlon-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='core2duo'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='core2duo-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='coreduo'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='coreduo-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='n270'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='n270-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='phenom'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='phenom-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <memoryBacking supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <enum name='sourceType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>file</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>anonymous</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>memfd</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </memoryBacking>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <disk supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='diskDevice'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>disk</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>cdrom</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>floppy</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>lun</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>fdc</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>sata</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <graphics supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vnc</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>egl-headless</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </graphics>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <video supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='modelType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vga</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>cirrus</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>none</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>bochs</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>ramfb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </video>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <hostdev supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='mode'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>subsystem</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='startupPolicy'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>default</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>mandatory</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>requisite</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>optional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='subsysType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pci</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='capsType'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='pciBackend'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </hostdev>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <rng supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>random</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>egd</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <filesystem supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='driverType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>path</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>handle</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtiofs</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </filesystem>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <tpm supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tpm-tis</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tpm-crb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>emulator</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>external</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendVersion'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>2.0</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </tpm>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <redirdev supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </redirdev>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <channel supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </channel>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <crypto supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>qemu</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </crypto>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <interface supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>default</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>passt</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <panic supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>isa</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>hyperv</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </panic>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <console supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>null</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vc</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>dev</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>file</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pipe</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>stdio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>udp</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tcp</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>qemu-vdagent</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </console>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <features>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <gic supported='no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <vmcoreinfo supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <genid supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <backingStoreInput supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <backup supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <async-teardown supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <ps2 supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <sev supported='no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <sgx supported='no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <hyperv supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='features'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>relaxed</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vapic</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>spinlocks</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vpindex</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>runtime</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>synic</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>stimer</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>reset</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vendor_id</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>frequencies</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>reenlightenment</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tlbflush</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>ipi</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>avic</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>emsr_bitmap</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>xmm_input</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <defaults>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <spinlocks>4095</spinlocks>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <stimer_direct>on</stimer_direct>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <tlbflush_direct>on</tlbflush_direct>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <tlbflush_extended>on</tlbflush_extended>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </defaults>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </hyperv>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <launchSecurity supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='sectype'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tdx</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </launchSecurity>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </features>
Nov 25 06:15:19 compute-0 nova_compute[186241]: </domainCapabilities>
Nov 25 06:15:19 compute-0 nova_compute[186241]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1026
Nov 25 06:15:19 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.952 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:941
Nov 25 06:15:19 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.954 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 25 06:15:19 compute-0 nova_compute[186241]: <domainCapabilities>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <path>/usr/libexec/qemu-kvm</path>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <domain>kvm</domain>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <arch>x86_64</arch>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <vcpu max='240'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <iothreads supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <os supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <enum name='firmware'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <loader supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>rom</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pflash</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='readonly'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>yes</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>no</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='secure'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>no</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </loader>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </os>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <cpu>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <mode name='host-passthrough' supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='hostPassthroughMigratable'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>on</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>off</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <mode name='maximum' supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='maximumMigratable'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>on</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>off</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <mode name='host-model' supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model fallback='forbid'>EPYC-Milan</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <vendor>AMD</vendor>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <maxphysaddr mode='passthrough' limit='48'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='x2apic'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-deadline'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='hypervisor'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc_adjust'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='vaes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='vpclmulqdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='spec-ctrl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='stibp'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='ssbd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='cmp_legacy'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='overflow-recov'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='succor'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='virt-ssbd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='lbrv'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-scale'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='vmcb-clean'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='flushbyasid'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='pause-filter'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='pfthreshold'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='v-vmsave-vmload'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='vgif'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <mode name='custom' supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Broadwell'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Broadwell-IBRS'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v4'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v5'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cooperlake'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Denverton'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Denverton-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx10'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx10-128'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx10-256'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx10-512'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Haswell'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Haswell-IBRS'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Haswell-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Haswell-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-noTSX'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v4'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v5'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v6'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v7'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='KnightsMill'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='KnightsMill-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='SierraForest'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='SierraForest-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-IBRS'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-IBRS'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v4'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v5'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v4'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='athlon'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='athlon-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='core2duo'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='core2duo-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='coreduo'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='coreduo-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='n270'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='n270-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='phenom'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='phenom-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <memoryBacking supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <enum name='sourceType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>file</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>anonymous</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>memfd</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </memoryBacking>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <disk supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='diskDevice'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>disk</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>cdrom</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>floppy</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>lun</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>ide</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>fdc</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>sata</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <graphics supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vnc</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>egl-headless</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </graphics>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <video supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='modelType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vga</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>cirrus</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>none</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>bochs</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>ramfb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </video>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <hostdev supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='mode'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>subsystem</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='startupPolicy'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>default</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>mandatory</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>requisite</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>optional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='subsysType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pci</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='capsType'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='pciBackend'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </hostdev>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <rng supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>random</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>egd</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <filesystem supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='driverType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>path</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>handle</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtiofs</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </filesystem>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <tpm supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tpm-tis</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tpm-crb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>emulator</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>external</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendVersion'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>2.0</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </tpm>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <redirdev supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </redirdev>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <channel supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </channel>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <crypto supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>qemu</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </crypto>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <interface supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>default</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>passt</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <panic supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>isa</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>hyperv</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </panic>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <console supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>null</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vc</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>dev</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>file</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pipe</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>stdio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>udp</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tcp</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>qemu-vdagent</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </console>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <features>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <gic supported='no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <vmcoreinfo supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <genid supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <backingStoreInput supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <backup supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <async-teardown supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <ps2 supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <sev supported='no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <sgx supported='no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <hyperv supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='features'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>relaxed</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vapic</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>spinlocks</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vpindex</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>runtime</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>synic</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>stimer</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>reset</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vendor_id</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>frequencies</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>reenlightenment</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tlbflush</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>ipi</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>avic</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>emsr_bitmap</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>xmm_input</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <defaults>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <spinlocks>4095</spinlocks>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <stimer_direct>on</stimer_direct>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <tlbflush_direct>on</tlbflush_direct>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <tlbflush_extended>on</tlbflush_extended>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </defaults>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </hyperv>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <launchSecurity supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='sectype'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tdx</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </launchSecurity>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </features>
Nov 25 06:15:19 compute-0 nova_compute[186241]: </domainCapabilities>
Nov 25 06:15:19 compute-0 nova_compute[186241]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1026
Nov 25 06:15:19 compute-0 nova_compute[186241]: 2025-11-25 06:15:18.996 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 25 06:15:19 compute-0 nova_compute[186241]: <domainCapabilities>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <path>/usr/libexec/qemu-kvm</path>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <domain>kvm</domain>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <arch>x86_64</arch>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <vcpu max='4096'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <iothreads supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <os supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <enum name='firmware'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>efi</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <loader supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>rom</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pflash</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='readonly'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>yes</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>no</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='secure'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>yes</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>no</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </loader>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </os>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <cpu>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <mode name='host-passthrough' supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='hostPassthroughMigratable'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>on</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>off</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <mode name='maximum' supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='maximumMigratable'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>on</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>off</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <mode name='host-model' supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model fallback='forbid'>EPYC-Milan</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <vendor>AMD</vendor>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <maxphysaddr mode='passthrough' limit='48'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='x2apic'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-deadline'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='hypervisor'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc_adjust'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='vaes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='vpclmulqdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='spec-ctrl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='stibp'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='ssbd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='cmp_legacy'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='overflow-recov'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='succor'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='virt-ssbd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='lbrv'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-scale'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='vmcb-clean'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='flushbyasid'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='pause-filter'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='pfthreshold'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='v-vmsave-vmload'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='vgif'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <mode name='custom' supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Broadwell'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Broadwell-IBRS'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v4'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v5'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cooperlake'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Denverton'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Denverton-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx10'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx10-128'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx10-256'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx10-512'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Haswell'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Haswell-IBRS'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Haswell-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Haswell-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-noTSX'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v4'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v5'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v6'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v7'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='KnightsMill'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='KnightsMill-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:19 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='SierraForest'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='SierraForest-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-IBRS'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-IBRS'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v4'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v5'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v2'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v3'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v4'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='athlon'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='athlon-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='core2duo'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='core2duo-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='coreduo'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='coreduo-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='n270'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='n270-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='phenom'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <blockers model='phenom-v1'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </blockers>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </mode>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <memoryBacking supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <enum name='sourceType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>file</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>anonymous</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <value>memfd</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </memoryBacking>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <disk supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='diskDevice'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>disk</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>cdrom</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>floppy</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>lun</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>fdc</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>sata</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <graphics supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vnc</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>egl-headless</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </graphics>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <video supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='modelType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vga</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>cirrus</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>none</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>bochs</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>ramfb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </video>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <hostdev supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='mode'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>subsystem</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='startupPolicy'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>default</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>mandatory</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>requisite</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>optional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='subsysType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pci</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='capsType'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='pciBackend'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </hostdev>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <rng supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>random</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>egd</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <filesystem supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='driverType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>path</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>handle</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>virtiofs</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </filesystem>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <tpm supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tpm-tis</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tpm-crb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>emulator</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>external</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendVersion'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>2.0</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </tpm>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <redirdev supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </redirdev>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <channel supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </channel>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <crypto supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>qemu</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </crypto>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <interface supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='backendType'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>default</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>passt</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <panic supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>isa</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>hyperv</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </panic>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <console supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>null</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vc</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>dev</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>file</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>pipe</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>stdio</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>udp</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tcp</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>qemu-vdagent</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </console>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   <features>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <gic supported='no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <vmcoreinfo supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <genid supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <backingStoreInput supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <backup supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <async-teardown supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <ps2 supported='yes'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <sev supported='no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <sgx supported='no'/>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <hyperv supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='features'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>relaxed</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vapic</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>spinlocks</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vpindex</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>runtime</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>synic</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>stimer</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>reset</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>vendor_id</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>frequencies</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>reenlightenment</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tlbflush</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>ipi</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>avic</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>emsr_bitmap</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>xmm_input</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <defaults>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <spinlocks>4095</spinlocks>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <stimer_direct>on</stimer_direct>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <tlbflush_direct>on</tlbflush_direct>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <tlbflush_extended>on</tlbflush_extended>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </defaults>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </hyperv>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     <launchSecurity supported='yes'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       <enum name='sectype'>
Nov 25 06:15:19 compute-0 nova_compute[186241]:         <value>tdx</value>
Nov 25 06:15:19 compute-0 nova_compute[186241]:       </enum>
Nov 25 06:15:19 compute-0 nova_compute[186241]:     </launchSecurity>
Nov 25 06:15:19 compute-0 nova_compute[186241]:   </features>
Nov 25 06:15:19 compute-0 nova_compute[186241]: </domainCapabilities>
Nov 25 06:15:19 compute-0 nova_compute[186241]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1026
Nov 25 06:15:19 compute-0 nova_compute[186241]: 2025-11-25 06:15:19.036 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1874
Nov 25 06:15:19 compute-0 nova_compute[186241]: 2025-11-25 06:15:19.036 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1874
Nov 25 06:15:19 compute-0 nova_compute[186241]: 2025-11-25 06:15:19.036 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1874
Nov 25 06:15:19 compute-0 nova_compute[186241]: 2025-11-25 06:15:19.036 186245 INFO nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Secure Boot support detected
Nov 25 06:15:19 compute-0 nova_compute[186241]: 2025-11-25 06:15:19.040 186245 INFO nova.virt.libvirt.driver [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 25 06:15:19 compute-0 nova_compute[186241]: 2025-11-25 06:15:19.041 186245 INFO nova.virt.libvirt.driver [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 25 06:15:19 compute-0 nova_compute[186241]: 2025-11-25 06:15:19.233 186245 DEBUG nova.virt.libvirt.driver [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1197
Nov 25 06:15:19 compute-0 nova_compute[186241]: 2025-11-25 06:15:19.742 186245 INFO nova.virt.node [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Determined node identity b9b31722-b833-4ea1-a013-247935742e36 from /var/lib/nova/compute_id
Nov 25 06:15:20 compute-0 nova_compute[186241]: 2025-11-25 06:15:20.249 186245 WARNING nova.compute.manager [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Compute nodes ['b9b31722-b833-4ea1-a013-247935742e36'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 25 06:15:20 compute-0 sshd-session[186605]: Accepted publickey for zuul from 192.168.122.30 port 38384 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:15:20 compute-0 systemd-logind[744]: New session 25 of user zuul.
Nov 25 06:15:20 compute-0 systemd[1]: Started Session 25 of User zuul.
Nov 25 06:15:20 compute-0 sshd-session[186605]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:15:21 compute-0 podman[186732]: 2025-11-25 06:15:21.184968809 +0000 UTC m=+0.041180936 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:15:21 compute-0 nova_compute[186241]: 2025-11-25 06:15:21.256 186245 INFO nova.compute.manager [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 25 06:15:21 compute-0 python3.9[186767]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 06:15:22 compute-0 nova_compute[186241]: 2025-11-25 06:15:22.268 186245 WARNING nova.compute.manager [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 25 06:15:22 compute-0 nova_compute[186241]: 2025-11-25 06:15:22.269 186245 DEBUG oslo_concurrency.lockutils [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:15:22 compute-0 nova_compute[186241]: 2025-11-25 06:15:22.269 186245 DEBUG oslo_concurrency.lockutils [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:15:22 compute-0 nova_compute[186241]: 2025-11-25 06:15:22.269 186245 DEBUG oslo_concurrency.lockutils [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:15:22 compute-0 nova_compute[186241]: 2025-11-25 06:15:22.269 186245 DEBUG nova.compute.resource_tracker [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:15:22 compute-0 sudo[186928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfkscuslxwqbrstnhlmksjdjluzittax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051321.9229543-36-168699677859572/AnsiballZ_systemd_service.py'
Nov 25 06:15:22 compute-0 sudo[186928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:22 compute-0 nova_compute[186241]: 2025-11-25 06:15:22.470 186245 WARNING nova.virt.libvirt.driver [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:15:22 compute-0 nova_compute[186241]: 2025-11-25 06:15:22.471 186245 DEBUG nova.compute.resource_tracker [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6227MB free_disk=73.2220687866211GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:15:22 compute-0 nova_compute[186241]: 2025-11-25 06:15:22.471 186245 DEBUG oslo_concurrency.lockutils [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:15:22 compute-0 nova_compute[186241]: 2025-11-25 06:15:22.471 186245 DEBUG oslo_concurrency.lockutils [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:15:22 compute-0 python3.9[186930]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:15:22 compute-0 systemd[1]: Reloading.
Nov 25 06:15:22 compute-0 systemd-sysv-generator[186954]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:15:22 compute-0 systemd-rc-local-generator[186951]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:15:22 compute-0 sudo[186928]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:22 compute-0 nova_compute[186241]: 2025-11-25 06:15:22.974 186245 WARNING nova.compute.resource_tracker [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] No compute node record for compute-0.ctlplane.example.com:b9b31722-b833-4ea1-a013-247935742e36: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host b9b31722-b833-4ea1-a013-247935742e36 could not be found.
Nov 25 06:15:23 compute-0 python3.9[187115]: ansible-ansible.builtin.service_facts Invoked
Nov 25 06:15:23 compute-0 network[187132]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 06:15:23 compute-0 network[187133]: 'network-scripts' will be removed from distribution in near future.
Nov 25 06:15:23 compute-0 network[187134]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 06:15:23 compute-0 nova_compute[186241]: 2025-11-25 06:15:23.478 186245 INFO nova.compute.resource_tracker [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: b9b31722-b833-4ea1-a013-247935742e36
Nov 25 06:15:24 compute-0 nova_compute[186241]: 2025-11-25 06:15:24.992 186245 DEBUG nova.compute.resource_tracker [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:15:24 compute-0 nova_compute[186241]: 2025-11-25 06:15:24.992 186245 DEBUG nova.compute.resource_tracker [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:15:25 compute-0 sudo[187406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifozhjhjqpawpzojhgxgapqtjwxbcbuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051325.6178489-55-218643138167760/AnsiballZ_systemd_service.py'
Nov 25 06:15:25 compute-0 sudo[187406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:26 compute-0 python3.9[187408]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:15:26 compute-0 nova_compute[186241]: 2025-11-25 06:15:26.051 186245 INFO nova.scheduler.client.report [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] [req-e3cab15c-d157-48ca-95f3-c5f26337a150] Created resource provider record via placement API for resource provider with UUID b9b31722-b833-4ea1-a013-247935742e36 and name compute-0.ctlplane.example.com.
Nov 25 06:15:26 compute-0 sudo[187406]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:26 compute-0 nova_compute[186241]: 2025-11-25 06:15:26.564 186245 DEBUG nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 25 06:15:26 compute-0 nova_compute[186241]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1946
Nov 25 06:15:26 compute-0 nova_compute[186241]: 2025-11-25 06:15:26.564 186245 INFO nova.virt.libvirt.host [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] kernel doesn't support AMD SEV
Nov 25 06:15:26 compute-0 nova_compute[186241]: 2025-11-25 06:15:26.565 186245 DEBUG nova.compute.provider_tree [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Updating inventory in ProviderTree for provider b9b31722-b833-4ea1-a013-247935742e36 with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 25 06:15:26 compute-0 nova_compute[186241]: 2025-11-25 06:15:26.565 186245 DEBUG nova.virt.libvirt.driver [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:15:26 compute-0 sudo[187559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izhyvklbltgenzajvvnuhephpsxchidm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051326.2888074-65-94404878798892/AnsiballZ_file.py'
Nov 25 06:15:26 compute-0 sudo[187559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:26 compute-0 python3.9[187561]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:26 compute-0 sudo[187559]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:26 compute-0 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 06:15:27 compute-0 sudo[187712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrmfyvhyfjkfenbwshjfdqtwmjyywscg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051326.9182558-73-231338129293104/AnsiballZ_file.py'
Nov 25 06:15:27 compute-0 sudo[187712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:27 compute-0 nova_compute[186241]: 2025-11-25 06:15:27.096 186245 DEBUG nova.scheduler.client.report [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Updated inventory for provider b9b31722-b833-4ea1-a013-247935742e36 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:975
Nov 25 06:15:27 compute-0 nova_compute[186241]: 2025-11-25 06:15:27.097 186245 DEBUG nova.compute.provider_tree [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Updating resource provider b9b31722-b833-4ea1-a013-247935742e36 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 25 06:15:27 compute-0 nova_compute[186241]: 2025-11-25 06:15:27.097 186245 DEBUG nova.compute.provider_tree [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Updating inventory in ProviderTree for provider b9b31722-b833-4ea1-a013-247935742e36 with inventory: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 25 06:15:27 compute-0 nova_compute[186241]: 2025-11-25 06:15:27.183 186245 DEBUG nova.compute.provider_tree [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Updating resource provider b9b31722-b833-4ea1-a013-247935742e36 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 25 06:15:27 compute-0 python3.9[187714]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:27 compute-0 sudo[187712]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:27 compute-0 nova_compute[186241]: 2025-11-25 06:15:27.688 186245 DEBUG nova.compute.resource_tracker [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:15:27 compute-0 nova_compute[186241]: 2025-11-25 06:15:27.688 186245 DEBUG oslo_concurrency.lockutils [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:15:27 compute-0 nova_compute[186241]: 2025-11-25 06:15:27.688 186245 DEBUG nova.service [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:177
Nov 25 06:15:27 compute-0 sudo[187864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgfeprffzorrwibebdtebjyrgplgktzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051327.4194744-82-200557561067076/AnsiballZ_command.py'
Nov 25 06:15:27 compute-0 sudo[187864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:27 compute-0 nova_compute[186241]: 2025-11-25 06:15:27.722 186245 DEBUG nova.service [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:194
Nov 25 06:15:27 compute-0 nova_compute[186241]: 2025-11-25 06:15:27.723 186245 DEBUG nova.servicegroup.drivers.db [None req-97995a6b-a389-4a76-8a37-affea7b37372 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 25 06:15:27 compute-0 python3.9[187866]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:15:27 compute-0 sudo[187864]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:28 compute-0 python3.9[188018]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 06:15:28 compute-0 sudo[188168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlykhcbbmrweieqnnrmcxjkeqgpkipff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051328.5784197-100-216081158577568/AnsiballZ_systemd_service.py'
Nov 25 06:15:28 compute-0 sudo[188168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:29 compute-0 python3.9[188170]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:15:29 compute-0 systemd[1]: Reloading.
Nov 25 06:15:29 compute-0 systemd-rc-local-generator[188190]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:15:29 compute-0 systemd-sysv-generator[188194]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:15:29 compute-0 sudo[188168]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:29 compute-0 sudo[188355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqsyczqxnilmsfhhdwoelwahhrngheyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051329.3397605-108-162260812334220/AnsiballZ_command.py'
Nov 25 06:15:29 compute-0 sudo[188355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:29 compute-0 python3.9[188357]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:15:29 compute-0 sudo[188355]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:30 compute-0 sudo[188508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhbeeoeqmxdnppevjkuxboizcudrkcyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051329.8334415-117-2368933781026/AnsiballZ_file.py'
Nov 25 06:15:30 compute-0 sudo[188508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:30 compute-0 python3.9[188510]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:15:30 compute-0 sudo[188508]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:30 compute-0 python3.9[188660]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:15:31 compute-0 python3.9[188812]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:31 compute-0 python3.9[188933]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051330.8462374-133-76859902763403/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:15:32 compute-0 sudo[189083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiqbjqbwyglixwqgdrcuhbukzsakvtwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051331.7595909-148-176815324683717/AnsiballZ_group.py'
Nov 25 06:15:32 compute-0 sudo[189083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:32 compute-0 python3.9[189085]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 25 06:15:32 compute-0 sudo[189083]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:32 compute-0 sudo[189235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teahtvaheaweprseinkvvnvwtiytuiht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051332.4450767-159-187741763962695/AnsiballZ_getent.py'
Nov 25 06:15:32 compute-0 sudo[189235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:32 compute-0 python3.9[189237]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 25 06:15:32 compute-0 sudo[189235]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:33 compute-0 sudo[189388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-govcvwndmgjgmkmqqsruldfvbomdzhfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051333.011119-167-17129182460023/AnsiballZ_group.py'
Nov 25 06:15:33 compute-0 sudo[189388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:33 compute-0 python3.9[189390]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 06:15:33 compute-0 groupadd[189391]: group added to /etc/group: name=ceilometer, GID=42405
Nov 25 06:15:33 compute-0 groupadd[189391]: group added to /etc/gshadow: name=ceilometer
Nov 25 06:15:33 compute-0 groupadd[189391]: new group: name=ceilometer, GID=42405
Nov 25 06:15:33 compute-0 sudo[189388]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:33 compute-0 sudo[189546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dflxrongnlkigwkbxevmxgyethhyhjwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051333.4759724-175-24321015509078/AnsiballZ_user.py'
Nov 25 06:15:33 compute-0 sudo[189546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:33 compute-0 python3.9[189548]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 06:15:33 compute-0 useradd[189550]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Nov 25 06:15:33 compute-0 useradd[189550]: add 'ceilometer' to group 'libvirt'
Nov 25 06:15:33 compute-0 useradd[189550]: add 'ceilometer' to shadow group 'libvirt'
Nov 25 06:15:34 compute-0 sudo[189546]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:34 compute-0 python3.9[189706]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:35 compute-0 python3.9[189827]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764051334.5512133-201-183031390805047/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:35 compute-0 python3.9[189977]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:35 compute-0 python3.9[190098]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764051335.3242128-201-258644685107406/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:36 compute-0 python3.9[190248]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:36 compute-0 python3.9[190369]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764051336.0806513-201-106793534488022/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:37 compute-0 python3.9[190519]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:15:37 compute-0 python3.9[190671]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:15:38 compute-0 python3.9[190823]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:38 compute-0 python3.9[190944]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051337.7146955-260-172098487988930/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:38 compute-0 python3.9[191094]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:39 compute-0 python3.9[191170]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:39 compute-0 python3.9[191320]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:39 compute-0 python3.9[191441]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051339.193895-260-48858948403154/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=94807b28565e5c35f729a8200fb9b68afc4fc12a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:40 compute-0 python3.9[191591]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:40 compute-0 python3.9[191712]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051339.9724145-260-88511891389444/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:40 compute-0 podman[191836]: 2025-11-25 06:15:40.953065629 +0000 UTC m=+0.055848920 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125)
Nov 25 06:15:41 compute-0 python3.9[191871]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:41 compute-0 python3.9[192006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051340.7479033-260-207950922568495/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:41 compute-0 python3.9[192156]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:42 compute-0 python3.9[192277]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051341.5482008-260-103404459009856/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=3820eb6e48c35431ebf53228213a5d51b7591223 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:42 compute-0 python3.9[192427]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:42 compute-0 python3.9[192548]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051342.3234363-260-123281504262842/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:43 compute-0 python3.9[192698]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:43 compute-0 python3.9[192819]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051343.087953-260-256013406613677/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=33df3bf08923ad9105770f5abb51d4cde791931a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:44 compute-0 python3.9[192969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:44 compute-0 python3.9[193090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051343.8557389-260-208696155876771/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:44 compute-0 python3.9[193240]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:45 compute-0 podman[193294]: 2025-11-25 06:15:45.075987126 +0000 UTC m=+0.059557282 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Nov 25 06:15:45 compute-0 python3.9[193378]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051344.6009684-260-30939163480271/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=8bed8129af2c9145e8d37569bb493c0de1895d6f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:45 compute-0 python3.9[193528]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:46 compute-0 python3.9[193649]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051345.3439326-260-125780829918744/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:46 compute-0 python3.9[193799]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:46 compute-0 python3.9[193875]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:15:47.128 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:15:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:15:47.128 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:15:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:15:47.129 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:15:47 compute-0 python3.9[194025]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:47 compute-0 python3.9[194102]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:48 compute-0 python3.9[194252]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:48 compute-0 python3.9[194328]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:48 compute-0 sudo[194478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oylpryhjneghkmfcyxqiscpqphzluhbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051348.4292343-449-132389403949664/AnsiballZ_file.py'
Nov 25 06:15:48 compute-0 sudo[194478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:48 compute-0 python3.9[194480]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:48 compute-0 sudo[194478]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:49 compute-0 sudo[194630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozphxawvvarqulycsqgixbckixmhqilb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051348.8760705-457-10913852571267/AnsiballZ_file.py'
Nov 25 06:15:49 compute-0 sudo[194630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:49 compute-0 python3.9[194632]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:49 compute-0 sudo[194630]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:49 compute-0 sudo[194782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tghoijiutoviaondeukkphqtkevbizym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051349.3198738-465-125118577465816/AnsiballZ_file.py'
Nov 25 06:15:49 compute-0 sudo[194782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:49 compute-0 python3.9[194784]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:15:49 compute-0 sudo[194782]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:49 compute-0 sudo[194934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pceiwkdhdkhgbijacefqhitewfgutzuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051349.768655-473-108714641630444/AnsiballZ_systemd_service.py'
Nov 25 06:15:49 compute-0 sudo[194934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:50 compute-0 python3.9[194936]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:15:50 compute-0 systemd[1]: Reloading.
Nov 25 06:15:50 compute-0 systemd-rc-local-generator[194958]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:15:50 compute-0 systemd-sysv-generator[194962]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:15:50 compute-0 systemd[1]: Listening on Podman API Socket.
Nov 25 06:15:50 compute-0 sudo[194934]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:50 compute-0 sudo[195124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkrqwwctutrxmzfbgrkcxhekrikuxlzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051350.675843-482-165457134760778/AnsiballZ_stat.py'
Nov 25 06:15:50 compute-0 sudo[195124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:51 compute-0 python3.9[195126]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:51 compute-0 sudo[195124]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:51 compute-0 sudo[195259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmxmjpbhxfxzfkpyxczaleeuueoteyhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051350.675843-482-165457134760778/AnsiballZ_copy.py'
Nov 25 06:15:51 compute-0 sudo[195259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:51 compute-0 podman[195221]: 2025-11-25 06:15:51.247956173 +0000 UTC m=+0.036713304 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 25 06:15:51 compute-0 python3.9[195265]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051350.675843-482-165457134760778/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:15:51 compute-0 sudo[195259]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:51 compute-0 sudo[195339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnbmqhijuemraijxqmjvvuqdqygycwby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051350.675843-482-165457134760778/AnsiballZ_stat.py'
Nov 25 06:15:51 compute-0 sudo[195339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:51 compute-0 python3.9[195341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:51 compute-0 sudo[195339]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:51 compute-0 sudo[195462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfvknsukfaavxgmvzbbttyjjyhwfobad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051350.675843-482-165457134760778/AnsiballZ_copy.py'
Nov 25 06:15:51 compute-0 sudo[195462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:52 compute-0 python3.9[195464]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051350.675843-482-165457134760778/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:15:52 compute-0 sudo[195462]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:52 compute-0 sudo[195614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baetmcpqgyjkwcsdfommejefxarnndly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051352.3128037-510-140903503814376/AnsiballZ_container_config_data.py'
Nov 25 06:15:52 compute-0 sudo[195614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:52 compute-0 python3.9[195616]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 25 06:15:52 compute-0 sudo[195614]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:53 compute-0 sudo[195766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iopqnizkimwnmhaxhprftcptzmxwzdgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051353.0393178-519-91334139159590/AnsiballZ_container_config_hash.py'
Nov 25 06:15:53 compute-0 sudo[195766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:53 compute-0 python3.9[195768]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 06:15:53 compute-0 sudo[195766]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:54 compute-0 sudo[195918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyofmsbhpvzwzuoxpnsinmefgeovndtd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764051353.7326038-529-219622176847517/AnsiballZ_edpm_container_manage.py'
Nov 25 06:15:54 compute-0 sudo[195918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:54 compute-0 python3[195920]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 06:15:54 compute-0 podman[195949]: 2025-11-25 06:15:54.443197824 +0000 UTC m=+0.030047838 container create d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:15:54 compute-0 podman[195949]: 2025-11-25 06:15:54.429472659 +0000 UTC m=+0.016322684 image pull 884992811c8175ee05276a13464176221fd628ef0f4b26c22d3021b5f1aa08da quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:15:54 compute-0 python3[195920]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78 kolla_start
Nov 25 06:15:54 compute-0 sudo[195918]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:54 compute-0 sudo[196126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onqpidqvozpndvykofnucmirekirtmbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051354.6413336-537-112935210265495/AnsiballZ_stat.py'
Nov 25 06:15:54 compute-0 sudo[196126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:54 compute-0 python3.9[196128]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:15:54 compute-0 sudo[196126]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:55 compute-0 sudo[196280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfxrkgefcglkfexuwjlznclsvgqmrpdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051355.1577244-546-37450903588363/AnsiballZ_file.py'
Nov 25 06:15:55 compute-0 sudo[196280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:55 compute-0 python3.9[196282]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:55 compute-0 sudo[196280]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:55 compute-0 sudo[196431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uszcqtfarxzyhwqeuqbnsrqfseajsqdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051355.5389378-546-223940749781945/AnsiballZ_copy.py'
Nov 25 06:15:55 compute-0 sudo[196431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:55 compute-0 python3.9[196433]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764051355.5389378-546-223940749781945/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:15:56 compute-0 sudo[196431]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:56 compute-0 sudo[196507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nthwcvjztdmgmarkyhzezxeozoitoenl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051355.5389378-546-223940749781945/AnsiballZ_systemd.py'
Nov 25 06:15:56 compute-0 sudo[196507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:56 compute-0 python3.9[196509]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:15:56 compute-0 systemd[1]: Reloading.
Nov 25 06:15:56 compute-0 nova_compute[186241]: 2025-11-25 06:15:56.725 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:15:56 compute-0 systemd-rc-local-generator[196530]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:15:56 compute-0 systemd-sysv-generator[196533]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:15:56 compute-0 sudo[196507]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:57 compute-0 sudo[196618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twlxditpkpnlyejqqcedwudtvurxngwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051355.5389378-546-223940749781945/AnsiballZ_systemd.py'
Nov 25 06:15:57 compute-0 sudo[196618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:57 compute-0 nova_compute[186241]: 2025-11-25 06:15:57.230 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:15:57 compute-0 python3.9[196620]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:15:57 compute-0 systemd[1]: Reloading.
Nov 25 06:15:57 compute-0 systemd-rc-local-generator[196647]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:15:57 compute-0 systemd-sysv-generator[196650]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:15:57 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Nov 25 06:15:57 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:15:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bff9805300f4a3b35f7d275ff221001414a180ab701ee83912d93eab7be6631/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bff9805300f4a3b35f7d275ff221001414a180ab701ee83912d93eab7be6631/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bff9805300f4a3b35f7d275ff221001414a180ab701ee83912d93eab7be6631/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bff9805300f4a3b35f7d275ff221001414a180ab701ee83912d93eab7be6631/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:57 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544.
Nov 25 06:15:57 compute-0 podman[196660]: 2025-11-25 06:15:57.718768808 +0000 UTC m=+0.075460283 container init d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: + sudo -E kolla_set_configs
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: sudo: unable to send audit message: Operation not permitted
Nov 25 06:15:57 compute-0 sudo[196678]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 25 06:15:57 compute-0 sudo[196678]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 25 06:15:57 compute-0 sudo[196678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 25 06:15:57 compute-0 podman[196660]: 2025-11-25 06:15:57.743130756 +0000 UTC m=+0.099822211 container start d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 25 06:15:57 compute-0 podman[196660]: ceilometer_agent_compute
Nov 25 06:15:57 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Nov 25 06:15:57 compute-0 sudo[196618]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Validating config file
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Copying service configuration files
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: INFO:__main__:Writing out command to execute
Nov 25 06:15:57 compute-0 sudo[196678]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: ++ cat /run_command
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: + ARGS=
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: + sudo kolla_copy_cacerts
Nov 25 06:15:57 compute-0 podman[196679]: 2025-11-25 06:15:57.792557007 +0000 UTC m=+0.041874466 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:15:57 compute-0 systemd[1]: d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544-871af2bb30021bf.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 06:15:57 compute-0 systemd[1]: d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544-871af2bb30021bf.service: Failed with result 'exit-code'.
Nov 25 06:15:57 compute-0 sudo[196700]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: sudo: unable to send audit message: Operation not permitted
Nov 25 06:15:57 compute-0 sudo[196700]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 25 06:15:57 compute-0 sudo[196700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 25 06:15:57 compute-0 sudo[196700]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: + [[ ! -n '' ]]
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: + . kolla_extend_start
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: + umask 0022
Nov 25 06:15:57 compute-0 ceilometer_agent_compute[196672]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 25 06:15:58 compute-0 sudo[196850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihsgjkhsilwkfvesyozagfcncgslunre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051357.8895783-570-122473338453846/AnsiballZ_systemd.py'
Nov 25 06:15:58 compute-0 sudo[196850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:58 compute-0 python3.9[196852]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:15:58 compute-0 systemd[1]: Stopping ceilometer_agent_compute container...
Nov 25 06:15:58 compute-0 systemd[1]: libpod-d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544.scope: Deactivated successfully.
Nov 25 06:15:58 compute-0 podman[196858]: 2025-11-25 06:15:58.399312798 +0000 UTC m=+0.035230771 container died d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 25 06:15:58 compute-0 systemd[1]: d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544-871af2bb30021bf.timer: Deactivated successfully.
Nov 25 06:15:58 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544.
Nov 25 06:15:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544-userdata-shm.mount: Deactivated successfully.
Nov 25 06:15:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-0bff9805300f4a3b35f7d275ff221001414a180ab701ee83912d93eab7be6631-merged.mount: Deactivated successfully.
Nov 25 06:15:58 compute-0 podman[196858]: 2025-11-25 06:15:58.425056191 +0000 UTC m=+0.060974144 container cleanup d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:15:58 compute-0 podman[196858]: ceilometer_agent_compute
Nov 25 06:15:58 compute-0 podman[196880]: ceilometer_agent_compute
Nov 25 06:15:58 compute-0 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 25 06:15:58 compute-0 systemd[1]: Stopped ceilometer_agent_compute container.
Nov 25 06:15:58 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Nov 25 06:15:58 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:15:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bff9805300f4a3b35f7d275ff221001414a180ab701ee83912d93eab7be6631/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bff9805300f4a3b35f7d275ff221001414a180ab701ee83912d93eab7be6631/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bff9805300f4a3b35f7d275ff221001414a180ab701ee83912d93eab7be6631/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bff9805300f4a3b35f7d275ff221001414a180ab701ee83912d93eab7be6631/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 25 06:15:58 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544.
Nov 25 06:15:58 compute-0 podman[196890]: 2025-11-25 06:15:58.579280413 +0000 UTC m=+0.081416933 container init d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: + sudo -E kolla_set_configs
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: sudo: unable to send audit message: Operation not permitted
Nov 25 06:15:58 compute-0 sudo[196908]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 25 06:15:58 compute-0 sudo[196908]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 25 06:15:58 compute-0 sudo[196908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 25 06:15:58 compute-0 podman[196890]: 2025-11-25 06:15:58.600650951 +0000 UTC m=+0.102787462 container start d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:15:58 compute-0 podman[196890]: ceilometer_agent_compute
Nov 25 06:15:58 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Nov 25 06:15:58 compute-0 sudo[196850]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Validating config file
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Copying service configuration files
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: INFO:__main__:Writing out command to execute
Nov 25 06:15:58 compute-0 sudo[196908]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: ++ cat /run_command
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: + ARGS=
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: + sudo kolla_copy_cacerts
Nov 25 06:15:58 compute-0 podman[196909]: 2025-11-25 06:15:58.659695626 +0000 UTC m=+0.045025986 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 25 06:15:58 compute-0 systemd[1]: d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544-6f546720d73c7e04.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 06:15:58 compute-0 systemd[1]: d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544-6f546720d73c7e04.service: Failed with result 'exit-code'.
Nov 25 06:15:58 compute-0 sudo[196932]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: sudo: unable to send audit message: Operation not permitted
Nov 25 06:15:58 compute-0 sudo[196932]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 25 06:15:58 compute-0 sudo[196932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 25 06:15:58 compute-0 sudo[196932]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: + [[ ! -n '' ]]
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: + . kolla_extend_start
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: + umask 0022
Nov 25 06:15:58 compute-0 ceilometer_agent_compute[196902]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 25 06:15:58 compute-0 sudo[197080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbldkvnrvigbymwhbsvhtaisstuzkaze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051358.7555995-578-189194102430314/AnsiballZ_stat.py'
Nov 25 06:15:58 compute-0 sudo[197080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:59 compute-0 python3.9[197082]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:15:59 compute-0 sudo[197080]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:59 compute-0 sudo[197205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vouudezpfjjxkmxklxjsyaivfsdmkzqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051358.7555995-578-189194102430314/AnsiballZ_copy.py'
Nov 25 06:15:59 compute-0 sudo[197205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.344 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.344 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2804
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.345 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2805
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.345 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2806
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.345 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2807
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.345 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2809
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.345 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.345 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.345 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.345 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.345 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.345 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.346 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.346 2 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.346 2 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.346 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.346 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.346 2 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.346 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.346 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.346 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.347 2 WARNING oslo_config.cfg [-] Deprecated: Option "tenant_name_discovery" from group "DEFAULT" is deprecated. Use option "identity_name_discovery" from group "DEFAULT".
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.347 2 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.347 2 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.347 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.347 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.347 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.347 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.347 2 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.347 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.347 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.347 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.348 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.348 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.348 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.348 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.348 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.348 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.348 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.348 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.348 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.348 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.348 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.348 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.349 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.349 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.349 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.349 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.349 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.349 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.349 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.349 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.349 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.349 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.349 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.350 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.350 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.350 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.350 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.350 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.350 2 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.350 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.350 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.350 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.350 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.350 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.350 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.351 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.351 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.351 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.351 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.351 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.351 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.351 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.351 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.351 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.351 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.351 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.351 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.352 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.352 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.352 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.352 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.352 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.352 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.352 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.352 2 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.352 2 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.352 2 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.352 2 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.353 2 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.353 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.353 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.353 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.353 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.353 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.353 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.353 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.353 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.353 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.353 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.354 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.354 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.354 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.354 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.354 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.354 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.354 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.354 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.354 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.354 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.354 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.354 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.355 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.355 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.355 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.355 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.355 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.355 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.355 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.355 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.355 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.355 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.355 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.356 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.356 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.356 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.356 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.356 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.356 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.356 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.356 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.356 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.356 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.356 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.356 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.357 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.357 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.357 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.357 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.357 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.357 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.357 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.357 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2828
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.374 14 INFO ceilometer.polling.manager [-] Starting heartbeat child service. Listening on /var/lib/ceilometer/ceilometer-compute.socket
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.374 14 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.375 14 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2804
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.375 14 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2805
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.375 14 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2806
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.375 14 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2807
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.375 14 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2809
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.376 14 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.376 14 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.376 14 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.376 14 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.376 14 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.376 14 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.376 14 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.377 14 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.377 14 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.377 14 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.377 14 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.377 14 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.377 14 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.377 14 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.378 14 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.378 14 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.378 14 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.378 14 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.378 14 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.378 14 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.378 14 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.379 14 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.379 14 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.379 14 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.379 14 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.379 14 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.379 14 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.379 14 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.380 14 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.380 14 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.380 14 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.380 14 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.380 14 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.380 14 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.380 14 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.380 14 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.381 14 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.381 14 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.381 14 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.381 14 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.381 14 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.381 14 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.381 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.382 14 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.383 14 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.384 14 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.384 14 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.384 14 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.384 14 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.384 14 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.385 14 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.385 14 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.385 14 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.385 14 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.386 14 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.386 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.386 14 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.386 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.386 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.386 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.387 14 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.387 14 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.386 16 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.387 14 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.387 14 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.387 14 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.387 14 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.387 14 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.388 14 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.387 16 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.388 16 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.388 14 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.388 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.388 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.388 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.388 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.389 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.389 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.389 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.389 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.389 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.389 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.389 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.390 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.390 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.390 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.390 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.390 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.390 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.390 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.390 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.391 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.391 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.391 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.391 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.391 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.391 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.391 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.392 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.392 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.392 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.392 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.392 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.392 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.392 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.393 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.393 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.393 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.393 14 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2828
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.393 14 DEBUG cotyledon._service [-] Run service AgentHeartBeatManager(0) [14] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.393 14 DEBUG ceilometer.polling.manager [-] Started heartbeat child process. run /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:434
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.395 14 DEBUG ceilometer.polling.manager [-] Started heartbeat update thread _read_queue /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:437
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.395 14 DEBUG ceilometer.polling.manager [-] Started heartbeat reporting thread _report_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:442
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.416 16 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:95
Nov 25 06:15:59 compute-0 python3.9[197207]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051358.7555995-578-189194102430314/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:15:59 compute-0 sudo[197205]: pam_unix(sudo:session): session closed for user root
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.520 16 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.520 16 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2804
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.520 16 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2805
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.520 16 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2806
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.521 16 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2807
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.521 16 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2809
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.521 16 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.521 16 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.521 16 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.521 16 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.521 16 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.521 16 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.522 16 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.522 16 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.522 16 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.522 16 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.522 16 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.522 16 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.522 16 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.522 16 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.522 16 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.522 16 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.523 16 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.523 16 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.523 16 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.523 16 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.523 16 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.523 16 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.523 16 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.523 16 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.523 16 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.523 16 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.523 16 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.524 16 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.524 16 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.524 16 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.524 16 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.524 16 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.524 16 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.524 16 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.524 16 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.524 16 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.524 16 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.524 16 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.524 16 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.525 16 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.525 16 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.525 16 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.525 16 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.525 16 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.525 16 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.525 16 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.525 16 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.525 16 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.526 16 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.526 16 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.526 16 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.526 16 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.526 16 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.526 16 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.526 16 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.526 16 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.526 16 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.526 16 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.526 16 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.527 16 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2817
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.527 16 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.527 16 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.527 16 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.527 16 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.527 16 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.527 16 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.527 16 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.527 16 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.527 16 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.528 16 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.528 16 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.528 16 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.528 16 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.528 16 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.528 16 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.528 16 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.528 16 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.528 16 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.529 16 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.529 16 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.529 16 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.529 16 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.529 16 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.529 16 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.529 16 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.529 16 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.529 16 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.529 16 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.529 16 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.529 16 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.530 16 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.530 16 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.530 16 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.530 16 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.530 16 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.530 16 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.530 16 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.530 16 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.530 16 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.530 16 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.530 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.530 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.531 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.532 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.533 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.533 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.533 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.533 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.533 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.533 16 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.533 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.533 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.533 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.533 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.533 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.534 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.534 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.534 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.534 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.534 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.534 16 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.534 16 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.534 16 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.534 16 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2824
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.534 16 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2828
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.534 16 DEBUG cotyledon._service [-] Run service AgentManager(0) [16] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.536 16 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.548 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.548 16 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:95
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.551 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.551 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.551 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.551 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.551 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.557 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.557 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.557 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.557 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.557 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.557 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.558 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.558 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.558 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.558 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.558 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.558 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.558 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:15:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:15:59.559 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:15:59 compute-0 sudo[197369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvjbqqfefofpewvbmksvckxyhyecrjuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051359.7039766-595-81876982889318/AnsiballZ_container_config_data.py'
Nov 25 06:15:59 compute-0 sudo[197369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:00 compute-0 python3.9[197371]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 25 06:16:00 compute-0 sudo[197369]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:00 compute-0 sudo[197521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lldvzwnsvtzfhqtzffmdfaljzsttavhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051360.2270029-604-65551694103573/AnsiballZ_container_config_hash.py'
Nov 25 06:16:00 compute-0 sudo[197521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:00 compute-0 python3.9[197523]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 06:16:00 compute-0 sudo[197521]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:00 compute-0 sudo[197673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrqthikomuvbdnuajsvomaksvxhfudys ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764051360.7676501-614-73345613607986/AnsiballZ_edpm_container_manage.py'
Nov 25 06:16:00 compute-0 sudo[197673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:01 compute-0 python3[197675]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 06:16:01 compute-0 podman[197704]: 2025-11-25 06:16:01.270165062 +0000 UTC m=+0.026470495 container create 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:16:01 compute-0 podman[197704]: 2025-11-25 06:16:01.257244423 +0000 UTC m=+0.013549847 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Nov 25 06:16:01 compute-0 python3[197675]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 25 06:16:01 compute-0 sudo[197673]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:01 compute-0 sudo[197882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxmdjtphgaidelkvowzvvsqraveanxbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051361.4665594-622-112986475100777/AnsiballZ_stat.py'
Nov 25 06:16:01 compute-0 sudo[197882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:01 compute-0 python3.9[197884]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:16:01 compute-0 sudo[197882]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:02 compute-0 sudo[198036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asbnhyzvlpmhuhoxhicfpnslmgmwbjue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051361.9852798-631-184581361543021/AnsiballZ_file.py'
Nov 25 06:16:02 compute-0 sudo[198036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:02 compute-0 python3.9[198038]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:02 compute-0 sudo[198036]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:02 compute-0 sudo[198187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onuvkktihfgmyspyccxpnifjtckhcwwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051362.3604403-631-22817591638191/AnsiballZ_copy.py'
Nov 25 06:16:02 compute-0 sudo[198187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:02 compute-0 python3.9[198189]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764051362.3604403-631-22817591638191/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:02 compute-0 sudo[198187]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:02 compute-0 sudo[198263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hneeehhguaiabqowgylnrubvqxydacba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051362.3604403-631-22817591638191/AnsiballZ_systemd.py'
Nov 25 06:16:02 compute-0 sudo[198263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:03 compute-0 python3.9[198265]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:16:03 compute-0 systemd[1]: Reloading.
Nov 25 06:16:03 compute-0 systemd-sysv-generator[198289]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:16:03 compute-0 systemd-rc-local-generator[198286]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:16:03 compute-0 sudo[198263]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:03 compute-0 sudo[198373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsobcidzjeaxrqwwbzxluvppobssiugv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051362.3604403-631-22817591638191/AnsiballZ_systemd.py'
Nov 25 06:16:03 compute-0 sudo[198373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:03 compute-0 python3.9[198375]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:16:03 compute-0 systemd[1]: Reloading.
Nov 25 06:16:03 compute-0 systemd-sysv-generator[198400]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:16:03 compute-0 systemd-rc-local-generator[198397]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:16:04 compute-0 systemd[1]: Starting node_exporter container...
Nov 25 06:16:04 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:16:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b56e80d4f10ea6524dac8cd65bb0a98d0a39f3fff4cbb948a8eb1ed10b01ff4/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b56e80d4f10ea6524dac8cd65bb0a98d0a39f3fff4cbb948a8eb1ed10b01ff4/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:04 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3.
Nov 25 06:16:04 compute-0 podman[198415]: 2025-11-25 06:16:04.166876898 +0000 UTC m=+0.076292412 container init 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.175Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.175Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.175Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=arp
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=bcache
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=bonding
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=cpu
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=edac
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=filefd
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=netclass
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=netdev
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=netstat
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=nfs
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=nvme
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=softnet
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=systemd
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=xfs
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.176Z caller=node_exporter.go:117 level=info collector=zfs
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.177Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 25 06:16:04 compute-0 node_exporter[198427]: ts=2025-11-25T06:16:04.177Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 25 06:16:04 compute-0 podman[198415]: 2025-11-25 06:16:04.189221493 +0000 UTC m=+0.098636987 container start 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:16:04 compute-0 podman[198415]: node_exporter
Nov 25 06:16:04 compute-0 systemd[1]: Started node_exporter container.
Nov 25 06:16:04 compute-0 sudo[198373]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:04 compute-0 podman[198436]: 2025-11-25 06:16:04.23402534 +0000 UTC m=+0.038364668 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 25 06:16:04 compute-0 sudo[198606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdxfybhadhthhznomwccdvuhhzixoinm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051364.3331552-655-183637157152731/AnsiballZ_systemd.py'
Nov 25 06:16:04 compute-0 sudo[198606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:04 compute-0 python3.9[198608]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:16:04 compute-0 systemd[1]: Stopping node_exporter container...
Nov 25 06:16:04 compute-0 systemd[1]: libpod-0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3.scope: Deactivated successfully.
Nov 25 06:16:04 compute-0 podman[198612]: 2025-11-25 06:16:04.837568664 +0000 UTC m=+0.043364615 container died 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:16:04 compute-0 systemd[1]: 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3-4b8c19e5af184b92.timer: Deactivated successfully.
Nov 25 06:16:04 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3.
Nov 25 06:16:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3-userdata-shm.mount: Deactivated successfully.
Nov 25 06:16:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-9b56e80d4f10ea6524dac8cd65bb0a98d0a39f3fff4cbb948a8eb1ed10b01ff4-merged.mount: Deactivated successfully.
Nov 25 06:16:04 compute-0 podman[198612]: 2025-11-25 06:16:04.859646877 +0000 UTC m=+0.065442827 container cleanup 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:16:04 compute-0 podman[198612]: node_exporter
Nov 25 06:16:04 compute-0 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 25 06:16:04 compute-0 podman[198633]: node_exporter
Nov 25 06:16:04 compute-0 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 25 06:16:04 compute-0 systemd[1]: Stopped node_exporter container.
Nov 25 06:16:04 compute-0 systemd[1]: Starting node_exporter container...
Nov 25 06:16:04 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:16:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b56e80d4f10ea6524dac8cd65bb0a98d0a39f3fff4cbb948a8eb1ed10b01ff4/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b56e80d4f10ea6524dac8cd65bb0a98d0a39f3fff4cbb948a8eb1ed10b01ff4/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:04 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3.
Nov 25 06:16:04 compute-0 podman[198643]: 2025-11-25 06:16:04.992646967 +0000 UTC m=+0.070994925 container init 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.001Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.001Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.001Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.001Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.001Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=arp
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=bcache
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=bonding
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=cpu
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=edac
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=filefd
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=netclass
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=netdev
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=netstat
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=nfs
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=nvme
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=softnet
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=systemd
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=xfs
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=node_exporter.go:117 level=info collector=zfs
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.002Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 25 06:16:05 compute-0 node_exporter[198655]: ts=2025-11-25T06:16:05.003Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 25 06:16:05 compute-0 podman[198643]: 2025-11-25 06:16:05.01420038 +0000 UTC m=+0.092548319 container start 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:16:05 compute-0 podman[198643]: node_exporter
Nov 25 06:16:05 compute-0 systemd[1]: Started node_exporter container.
Nov 25 06:16:05 compute-0 sudo[198606]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:05 compute-0 podman[198664]: 2025-11-25 06:16:05.061916277 +0000 UTC m=+0.035174774 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:16:05 compute-0 sudo[198834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygcmtuwhrdlolkzgygzvfhufoimovapa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051365.149886-663-8609960338580/AnsiballZ_stat.py'
Nov 25 06:16:05 compute-0 sudo[198834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:05 compute-0 python3.9[198836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:16:05 compute-0 sudo[198834]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:05 compute-0 sudo[198957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxqbudjlgvswstilqhufmcjaxejdeisl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051365.149886-663-8609960338580/AnsiballZ_copy.py'
Nov 25 06:16:05 compute-0 sudo[198957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:05 compute-0 python3.9[198959]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051365.149886-663-8609960338580/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:16:05 compute-0 sudo[198957]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:06 compute-0 sudo[199109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mespcuzijbfndcjztejhyvzxpcqgimiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051366.0635169-680-104058176284353/AnsiballZ_container_config_data.py'
Nov 25 06:16:06 compute-0 sudo[199109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:06 compute-0 python3.9[199111]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 25 06:16:06 compute-0 sudo[199109]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:06 compute-0 sudo[199261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmskkkcyiscypgxtmxzmurerbgujhawc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051366.5442297-689-87864895530208/AnsiballZ_container_config_hash.py'
Nov 25 06:16:06 compute-0 sudo[199261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:06 compute-0 python3.9[199263]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 06:16:06 compute-0 sudo[199261]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:07 compute-0 sudo[199413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfveucivrkarzjlqxdptyqyhaissenrp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764051367.0758474-699-277520884920312/AnsiballZ_edpm_container_manage.py'
Nov 25 06:16:07 compute-0 sudo[199413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:07 compute-0 python3[199415]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 06:16:09 compute-0 podman[199427]: 2025-11-25 06:16:09.624309111 +0000 UTC m=+2.118632378 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 25 06:16:09 compute-0 podman[199508]: 2025-11-25 06:16:09.737278387 +0000 UTC m=+0.039121445 container create 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Nov 25 06:16:09 compute-0 podman[199508]: 2025-11-25 06:16:09.717182952 +0000 UTC m=+0.019026020 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 25 06:16:09 compute-0 python3[199415]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 25 06:16:09 compute-0 sudo[199413]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:10 compute-0 sudo[199685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcmciiachtozcvfrmtlvfdqwqxqweydu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051369.9699192-707-246684866520869/AnsiballZ_stat.py'
Nov 25 06:16:10 compute-0 sudo[199685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:10 compute-0 python3.9[199687]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:16:10 compute-0 sudo[199685]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:10 compute-0 sudo[199839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxnwizpxsuolubmwydhvabxyiygnnllt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051370.5130281-716-212875570421075/AnsiballZ_file.py'
Nov 25 06:16:10 compute-0 sudo[199839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:10 compute-0 python3.9[199841]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:10 compute-0 sudo[199839]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:11 compute-0 podman[199894]: 2025-11-25 06:16:11.096322855 +0000 UTC m=+0.069797708 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 25 06:16:11 compute-0 sudo[200013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbkkedrcecqqwqqyxfuxuepkxtmjerfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051370.9052079-716-260938805992286/AnsiballZ_copy.py'
Nov 25 06:16:11 compute-0 sudo[200013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:11 compute-0 python3.9[200015]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764051370.9052079-716-260938805992286/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:11 compute-0 sudo[200013]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:11 compute-0 sudo[200089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxjranrmpvofvkpnbjkmkatifcxzonjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051370.9052079-716-260938805992286/AnsiballZ_systemd.py'
Nov 25 06:16:11 compute-0 sudo[200089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:11 compute-0 python3.9[200091]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:16:11 compute-0 systemd[1]: Reloading.
Nov 25 06:16:11 compute-0 systemd-sysv-generator[200113]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:16:11 compute-0 systemd-rc-local-generator[200109]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:16:13 compute-0 sudo[200089]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:13 compute-0 sudo[200200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkhkxujishbnwblknkboktitwsrspelt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051370.9052079-716-260938805992286/AnsiballZ_systemd.py'
Nov 25 06:16:13 compute-0 sudo[200200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:13 compute-0 python3.9[200202]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:16:13 compute-0 systemd[1]: Reloading.
Nov 25 06:16:13 compute-0 systemd-sysv-generator[200228]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:16:13 compute-0 systemd-rc-local-generator[200224]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:16:13 compute-0 systemd[1]: Starting podman_exporter container...
Nov 25 06:16:13 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:16:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa7aa0c951ce538823aa66c880aab29ac03d0243e87de2dc6ed756a092e121c0/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa7aa0c951ce538823aa66c880aab29ac03d0243e87de2dc6ed756a092e121c0/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:13 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226.
Nov 25 06:16:13 compute-0 podman[200242]: 2025-11-25 06:16:13.948076313 +0000 UTC m=+0.091517965 container init 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:16:13 compute-0 podman_exporter[200254]: ts=2025-11-25T06:16:13.962Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 25 06:16:13 compute-0 podman_exporter[200254]: ts=2025-11-25T06:16:13.962Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 25 06:16:13 compute-0 podman_exporter[200254]: ts=2025-11-25T06:16:13.963Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 25 06:16:13 compute-0 podman_exporter[200254]: ts=2025-11-25T06:16:13.963Z caller=handler.go:105 level=info collector=container
Nov 25 06:16:13 compute-0 systemd[1]: Starting Podman API Service...
Nov 25 06:16:13 compute-0 podman[200242]: 2025-11-25 06:16:13.978812439 +0000 UTC m=+0.122254071 container start 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:16:13 compute-0 systemd[1]: Started Podman API Service.
Nov 25 06:16:13 compute-0 podman[200242]: podman_exporter
Nov 25 06:16:13 compute-0 systemd[1]: Started podman_exporter container.
Nov 25 06:16:14 compute-0 podman[200265]: time="2025-11-25T06:16:14Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 25 06:16:14 compute-0 podman[200265]: time="2025-11-25T06:16:14Z" level=info msg="Setting parallel job count to 13"
Nov 25 06:16:14 compute-0 podman[200265]: time="2025-11-25T06:16:14Z" level=info msg="Using sqlite as database backend"
Nov 25 06:16:14 compute-0 podman[200265]: time="2025-11-25T06:16:14Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 25 06:16:14 compute-0 podman[200265]: time="2025-11-25T06:16:14Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 25 06:16:14 compute-0 podman[200265]: time="2025-11-25T06:16:14Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 25 06:16:14 compute-0 podman[200265]: @ - - [25/Nov/2025:06:16:14 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 25 06:16:14 compute-0 podman[200265]: time="2025-11-25T06:16:14Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 06:16:14 compute-0 sudo[200200]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:14 compute-0 podman[200265]: @ - - [25/Nov/2025:06:16:14 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 20138 "" "Go-http-client/1.1"
Nov 25 06:16:14 compute-0 podman_exporter[200254]: ts=2025-11-25T06:16:14.052Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 25 06:16:14 compute-0 podman[200264]: 2025-11-25 06:16:14.052931422 +0000 UTC m=+0.065923704 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:16:14 compute-0 podman_exporter[200254]: ts=2025-11-25T06:16:14.053Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 25 06:16:14 compute-0 podman_exporter[200254]: ts=2025-11-25T06:16:14.053Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 25 06:16:14 compute-0 systemd[1]: 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226-78549e98a7e33c2d.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 06:16:14 compute-0 systemd[1]: 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226-78549e98a7e33c2d.service: Failed with result 'exit-code'.
Nov 25 06:16:14 compute-0 sudo[200445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcbfonrsfozwfzsgocuthogpmoegnrym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051374.1560926-740-127840701630716/AnsiballZ_systemd.py'
Nov 25 06:16:14 compute-0 sudo[200445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:14 compute-0 python3.9[200447]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:16:14 compute-0 systemd[1]: Stopping podman_exporter container...
Nov 25 06:16:14 compute-0 podman[200265]: @ - - [25/Nov/2025:06:16:14 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1641 "" "Go-http-client/1.1"
Nov 25 06:16:14 compute-0 systemd[1]: libpod-834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226.scope: Deactivated successfully.
Nov 25 06:16:14 compute-0 podman[200451]: 2025-11-25 06:16:14.665709115 +0000 UTC m=+0.035509856 container died 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 06:16:14 compute-0 systemd[1]: 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226-78549e98a7e33c2d.timer: Deactivated successfully.
Nov 25 06:16:14 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226.
Nov 25 06:16:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226-userdata-shm.mount: Deactivated successfully.
Nov 25 06:16:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-aa7aa0c951ce538823aa66c880aab29ac03d0243e87de2dc6ed756a092e121c0-merged.mount: Deactivated successfully.
Nov 25 06:16:14 compute-0 podman[200451]: 2025-11-25 06:16:14.828877751 +0000 UTC m=+0.198678481 container cleanup 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 06:16:14 compute-0 podman[200451]: podman_exporter
Nov 25 06:16:14 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 25 06:16:14 compute-0 podman[200473]: podman_exporter
Nov 25 06:16:14 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 25 06:16:14 compute-0 systemd[1]: Stopped podman_exporter container.
Nov 25 06:16:14 compute-0 systemd[1]: Starting podman_exporter container...
Nov 25 06:16:14 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:16:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa7aa0c951ce538823aa66c880aab29ac03d0243e87de2dc6ed756a092e121c0/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa7aa0c951ce538823aa66c880aab29ac03d0243e87de2dc6ed756a092e121c0/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:14 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226.
Nov 25 06:16:14 compute-0 podman[200482]: 2025-11-25 06:16:14.955354111 +0000 UTC m=+0.069713177 container init 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 06:16:14 compute-0 podman_exporter[200494]: ts=2025-11-25T06:16:14.964Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 25 06:16:14 compute-0 podman_exporter[200494]: ts=2025-11-25T06:16:14.964Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 25 06:16:14 compute-0 podman_exporter[200494]: ts=2025-11-25T06:16:14.964Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 25 06:16:14 compute-0 podman_exporter[200494]: ts=2025-11-25T06:16:14.964Z caller=handler.go:105 level=info collector=container
Nov 25 06:16:14 compute-0 podman[200265]: @ - - [25/Nov/2025:06:16:14 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 25 06:16:14 compute-0 podman[200265]: time="2025-11-25T06:16:14Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 06:16:14 compute-0 podman[200482]: 2025-11-25 06:16:14.97851758 +0000 UTC m=+0.092876647 container start 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:16:14 compute-0 podman[200482]: podman_exporter
Nov 25 06:16:14 compute-0 podman[200265]: @ - - [25/Nov/2025:06:16:14 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 20140 "" "Go-http-client/1.1"
Nov 25 06:16:14 compute-0 systemd[1]: Started podman_exporter container.
Nov 25 06:16:14 compute-0 podman_exporter[200494]: ts=2025-11-25T06:16:14.981Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 25 06:16:14 compute-0 podman_exporter[200494]: ts=2025-11-25T06:16:14.982Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 25 06:16:14 compute-0 podman_exporter[200494]: ts=2025-11-25T06:16:14.982Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 25 06:16:15 compute-0 sudo[200445]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:15 compute-0 podman[200504]: 2025-11-25 06:16:15.021117222 +0000 UTC m=+0.037390964 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:16:15 compute-0 sudo[200689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cilmypwxxirxgessdvkrsrubfyvuyyhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051375.1177406-748-48923694335414/AnsiballZ_stat.py'
Nov 25 06:16:15 compute-0 sudo[200689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:15 compute-0 podman[200648]: 2025-11-25 06:16:15.303944403 +0000 UTC m=+0.040952636 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 25 06:16:15 compute-0 python3.9[200694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:16:15 compute-0 sudo[200689]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:15 compute-0 sudo[200815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkvdeqdpbonkmwynmqhxrlugjrzldryv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051375.1177406-748-48923694335414/AnsiballZ_copy.py'
Nov 25 06:16:15 compute-0 sudo[200815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:15 compute-0 python3.9[200817]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764051375.1177406-748-48923694335414/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 06:16:15 compute-0 sudo[200815]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:15 compute-0 nova_compute[186241]: 2025-11-25 06:16:15.933 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:16:15 compute-0 nova_compute[186241]: 2025-11-25 06:16:15.934 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:16:15 compute-0 nova_compute[186241]: 2025-11-25 06:16:15.934 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:16:15 compute-0 nova_compute[186241]: 2025-11-25 06:16:15.934 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:16:15 compute-0 nova_compute[186241]: 2025-11-25 06:16:15.934 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:16:15 compute-0 nova_compute[186241]: 2025-11-25 06:16:15.935 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:16:15 compute-0 nova_compute[186241]: 2025-11-25 06:16:15.935 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:16:15 compute-0 nova_compute[186241]: 2025-11-25 06:16:15.935 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:16:15 compute-0 nova_compute[186241]: 2025-11-25 06:16:15.935 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:16:16 compute-0 sudo[200967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogmajxymlcxistkzdlmwhgnnnapjqmuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051376.07009-765-40184261027589/AnsiballZ_container_config_data.py'
Nov 25 06:16:16 compute-0 sudo[200967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:16 compute-0 python3.9[200969]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 25 06:16:16 compute-0 sudo[200967]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:16 compute-0 nova_compute[186241]: 2025-11-25 06:16:16.444 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:16:16 compute-0 nova_compute[186241]: 2025-11-25 06:16:16.444 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:16:16 compute-0 nova_compute[186241]: 2025-11-25 06:16:16.444 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:16:16 compute-0 nova_compute[186241]: 2025-11-25 06:16:16.444 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:16:16 compute-0 nova_compute[186241]: 2025-11-25 06:16:16.645 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:16:16 compute-0 nova_compute[186241]: 2025-11-25 06:16:16.646 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6022MB free_disk=73.18881607055664GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:16:16 compute-0 nova_compute[186241]: 2025-11-25 06:16:16.646 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:16:16 compute-0 nova_compute[186241]: 2025-11-25 06:16:16.646 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:16:16 compute-0 sudo[201119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqpjxsvgehmiqjynaotkqkfgbofqhdhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051376.6171424-774-84940277150695/AnsiballZ_container_config_hash.py'
Nov 25 06:16:16 compute-0 sudo[201119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:16 compute-0 python3.9[201121]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 06:16:16 compute-0 sudo[201119]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:17 compute-0 sudo[201271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcwhqymdjqkxeyovgpbbnilsibcbhwri ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764051377.1877024-784-153797091312611/AnsiballZ_edpm_container_manage.py'
Nov 25 06:16:17 compute-0 sudo[201271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:17 compute-0 python3[201273]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 06:16:17 compute-0 nova_compute[186241]: 2025-11-25 06:16:17.677 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:16:17 compute-0 nova_compute[186241]: 2025-11-25 06:16:17.677 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:16:17 compute-0 nova_compute[186241]: 2025-11-25 06:16:17.698 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:16:18 compute-0 nova_compute[186241]: 2025-11-25 06:16:18.201 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:16:18 compute-0 nova_compute[186241]: 2025-11-25 06:16:18.203 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:16:18 compute-0 nova_compute[186241]: 2025-11-25 06:16:18.203 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:16:20 compute-0 podman[201283]: 2025-11-25 06:16:20.000770166 +0000 UTC m=+2.361415636 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 25 06:16:20 compute-0 podman[201364]: 2025-11-25 06:16:20.093633448 +0000 UTC m=+0.029208627 container create f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Nov 25 06:16:20 compute-0 podman[201364]: 2025-11-25 06:16:20.080319558 +0000 UTC m=+0.015894747 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 25 06:16:20 compute-0 python3[201273]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume 
/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 25 06:16:20 compute-0 sudo[201271]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:20 compute-0 sudo[201542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neikxmwscccrpxizlucaxwabknwhnwju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051380.3080032-792-266003934280658/AnsiballZ_stat.py'
Nov 25 06:16:20 compute-0 sudo[201542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:20 compute-0 python3.9[201544]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:16:20 compute-0 sudo[201542]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:20 compute-0 sudo[201696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlyjkocbfsyzssvyytqueffjdxqrmhlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051380.8087184-801-72229280790736/AnsiballZ_file.py'
Nov 25 06:16:20 compute-0 sudo[201696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:21 compute-0 python3.9[201698]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:21 compute-0 sudo[201696]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:21 compute-0 auditd[672]: Audit daemon rotating log files
Nov 25 06:16:21 compute-0 sudo[201861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjkhybexqqnxhdsmdgvqatiynjhteewn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051381.176843-801-134221416738389/AnsiballZ_copy.py'
Nov 25 06:16:21 compute-0 sudo[201861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:21 compute-0 podman[201821]: 2025-11-25 06:16:21.469117609 +0000 UTC m=+0.039584468 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true)
Nov 25 06:16:21 compute-0 python3.9[201865]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764051381.176843-801-134221416738389/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:21 compute-0 sudo[201861]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:21 compute-0 sudo[201939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsaikamketkjcvsqkzqzfmfjwgwqwknk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051381.176843-801-134221416738389/AnsiballZ_systemd.py'
Nov 25 06:16:21 compute-0 sudo[201939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:22 compute-0 python3.9[201941]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 06:16:22 compute-0 systemd[1]: Reloading.
Nov 25 06:16:22 compute-0 systemd-rc-local-generator[201962]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:16:22 compute-0 systemd-sysv-generator[201965]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:16:22 compute-0 sudo[201939]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:22 compute-0 sudo[202050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhgvwwfwrkkstetoamphzzpsosfqhhab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051381.176843-801-134221416738389/AnsiballZ_systemd.py'
Nov 25 06:16:22 compute-0 sudo[202050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:22 compute-0 python3.9[202052]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 06:16:22 compute-0 systemd[1]: Reloading.
Nov 25 06:16:22 compute-0 systemd-rc-local-generator[202075]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 06:16:22 compute-0 systemd-sysv-generator[202078]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 06:16:22 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 25 06:16:23 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:16:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9371e1fc88e11a8e76e4ad24fb285d17ae90235f31fb2a363a5479071cdc3564/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9371e1fc88e11a8e76e4ad24fb285d17ae90235f31fb2a363a5479071cdc3564/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9371e1fc88e11a8e76e4ad24fb285d17ae90235f31fb2a363a5479071cdc3564/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:23 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d.
Nov 25 06:16:23 compute-0 podman[202092]: 2025-11-25 06:16:23.070266177 +0000 UTC m=+0.087231543 container init f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Nov 25 06:16:23 compute-0 openstack_network_exporter[202105]: INFO    06:16:23 main.go:48: registering *bridge.Collector
Nov 25 06:16:23 compute-0 openstack_network_exporter[202105]: INFO    06:16:23 main.go:48: registering *coverage.Collector
Nov 25 06:16:23 compute-0 openstack_network_exporter[202105]: INFO    06:16:23 main.go:48: registering *datapath.Collector
Nov 25 06:16:23 compute-0 openstack_network_exporter[202105]: INFO    06:16:23 main.go:48: registering *iface.Collector
Nov 25 06:16:23 compute-0 openstack_network_exporter[202105]: INFO    06:16:23 main.go:48: registering *memory.Collector
Nov 25 06:16:23 compute-0 openstack_network_exporter[202105]: INFO    06:16:23 main.go:48: registering *ovnnorthd.Collector
Nov 25 06:16:23 compute-0 openstack_network_exporter[202105]: INFO    06:16:23 main.go:48: registering *ovn.Collector
Nov 25 06:16:23 compute-0 openstack_network_exporter[202105]: INFO    06:16:23 main.go:48: registering *ovsdbserver.Collector
Nov 25 06:16:23 compute-0 openstack_network_exporter[202105]: INFO    06:16:23 main.go:48: registering *pmd_perf.Collector
Nov 25 06:16:23 compute-0 openstack_network_exporter[202105]: INFO    06:16:23 main.go:48: registering *pmd_rxq.Collector
Nov 25 06:16:23 compute-0 openstack_network_exporter[202105]: INFO    06:16:23 main.go:48: registering *vswitch.Collector
Nov 25 06:16:23 compute-0 openstack_network_exporter[202105]: NOTICE  06:16:23 main.go:76: listening on https://:9105/metrics
Nov 25 06:16:23 compute-0 podman[202092]: 2025-11-25 06:16:23.096901072 +0000 UTC m=+0.113866417 container start f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Nov 25 06:16:23 compute-0 podman[202092]: openstack_network_exporter
Nov 25 06:16:23 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 25 06:16:23 compute-0 sudo[202050]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:23 compute-0 podman[202115]: 2025-11-25 06:16:23.179311638 +0000 UTC m=+0.074731277 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41)
Nov 25 06:16:23 compute-0 sudo[202284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlbnfzjljawwghnfiaytjazibrchfyke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051383.2514749-825-260809830379032/AnsiballZ_systemd.py'
Nov 25 06:16:23 compute-0 sudo[202284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:23 compute-0 python3.9[202286]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 06:16:23 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Nov 25 06:16:23 compute-0 systemd[1]: libpod-f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d.scope: Deactivated successfully.
Nov 25 06:16:23 compute-0 conmon[202105]: conmon f3070edeeb1807616794 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d.scope/container/memory.events
Nov 25 06:16:23 compute-0 podman[202290]: 2025-11-25 06:16:23.756961132 +0000 UTC m=+0.038441721 container died f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Nov 25 06:16:23 compute-0 systemd[1]: f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d-308608a75b5e74dc.timer: Deactivated successfully.
Nov 25 06:16:23 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d.
Nov 25 06:16:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d-userdata-shm.mount: Deactivated successfully.
Nov 25 06:16:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-9371e1fc88e11a8e76e4ad24fb285d17ae90235f31fb2a363a5479071cdc3564-merged.mount: Deactivated successfully.
Nov 25 06:16:24 compute-0 podman[202290]: 2025-11-25 06:16:24.328204278 +0000 UTC m=+0.609684868 container cleanup f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, release=1755695350, distribution-scope=public)
Nov 25 06:16:24 compute-0 podman[202290]: openstack_network_exporter
Nov 25 06:16:24 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 25 06:16:24 compute-0 podman[202312]: openstack_network_exporter
Nov 25 06:16:24 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 25 06:16:24 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Nov 25 06:16:24 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 25 06:16:24 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:16:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9371e1fc88e11a8e76e4ad24fb285d17ae90235f31fb2a363a5479071cdc3564/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9371e1fc88e11a8e76e4ad24fb285d17ae90235f31fb2a363a5479071cdc3564/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9371e1fc88e11a8e76e4ad24fb285d17ae90235f31fb2a363a5479071cdc3564/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 06:16:24 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d.
Nov 25 06:16:24 compute-0 podman[202322]: 2025-11-25 06:16:24.465690658 +0000 UTC m=+0.074362406 container init f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_id=edpm, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 25 06:16:24 compute-0 openstack_network_exporter[202334]: INFO    06:16:24 main.go:48: registering *bridge.Collector
Nov 25 06:16:24 compute-0 openstack_network_exporter[202334]: INFO    06:16:24 main.go:48: registering *coverage.Collector
Nov 25 06:16:24 compute-0 openstack_network_exporter[202334]: INFO    06:16:24 main.go:48: registering *datapath.Collector
Nov 25 06:16:24 compute-0 openstack_network_exporter[202334]: INFO    06:16:24 main.go:48: registering *iface.Collector
Nov 25 06:16:24 compute-0 openstack_network_exporter[202334]: INFO    06:16:24 main.go:48: registering *memory.Collector
Nov 25 06:16:24 compute-0 openstack_network_exporter[202334]: INFO    06:16:24 main.go:48: registering *ovnnorthd.Collector
Nov 25 06:16:24 compute-0 openstack_network_exporter[202334]: INFO    06:16:24 main.go:48: registering *ovn.Collector
Nov 25 06:16:24 compute-0 openstack_network_exporter[202334]: INFO    06:16:24 main.go:48: registering *ovsdbserver.Collector
Nov 25 06:16:24 compute-0 openstack_network_exporter[202334]: INFO    06:16:24 main.go:48: registering *pmd_perf.Collector
Nov 25 06:16:24 compute-0 openstack_network_exporter[202334]: INFO    06:16:24 main.go:48: registering *pmd_rxq.Collector
Nov 25 06:16:24 compute-0 openstack_network_exporter[202334]: INFO    06:16:24 main.go:48: registering *vswitch.Collector
Nov 25 06:16:24 compute-0 openstack_network_exporter[202334]: NOTICE  06:16:24 main.go:76: listening on https://:9105/metrics
Nov 25 06:16:24 compute-0 podman[202322]: 2025-11-25 06:16:24.483258591 +0000 UTC m=+0.091930319 container start f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 25 06:16:24 compute-0 podman[202322]: openstack_network_exporter
Nov 25 06:16:24 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 25 06:16:24 compute-0 sudo[202284]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:24 compute-0 podman[202344]: 2025-11-25 06:16:24.53107534 +0000 UTC m=+0.040271682 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Nov 25 06:16:24 compute-0 sudo[202511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcuznvpojaydudkwdekxqylkpgxuaarr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051384.623568-833-114432303456333/AnsiballZ_find.py'
Nov 25 06:16:24 compute-0 sudo[202511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:24 compute-0 python3.9[202513]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 06:16:24 compute-0 sudo[202511]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:25 compute-0 sudo[202663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocgyfpymifefrhhnvagnlmkghxucsnxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051385.2588146-843-49964114556962/AnsiballZ_podman_container_info.py'
Nov 25 06:16:25 compute-0 sudo[202663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:25 compute-0 python3.9[202665]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 25 06:16:25 compute-0 sudo[202663]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:26 compute-0 sudo[202826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kenjesntbonenldtyqmfzhdhcccqfjdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051385.8872552-851-228453287600759/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:26 compute-0 sudo[202826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:26 compute-0 python3.9[202828]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:26 compute-0 systemd[1]: Started libpod-conmon-f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54.scope.
Nov 25 06:16:26 compute-0 podman[202829]: 2025-11-25 06:16:26.410424874 +0000 UTC m=+0.048823606 container exec f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 06:16:26 compute-0 podman[202845]: 2025-11-25 06:16:26.464507695 +0000 UTC m=+0.044608256 container exec_died f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 06:16:26 compute-0 podman[202829]: 2025-11-25 06:16:26.467441383 +0000 UTC m=+0.105840105 container exec_died f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:16:26 compute-0 systemd[1]: libpod-conmon-f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54.scope: Deactivated successfully.
Nov 25 06:16:26 compute-0 sudo[202826]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:26 compute-0 sudo[203003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqvjyjuplaofncpnueyulfvxgwrtzmkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051386.6190238-859-138556810198952/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:26 compute-0 sudo[203003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:26 compute-0 python3.9[203005]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:27 compute-0 systemd[1]: Started libpod-conmon-f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54.scope.
Nov 25 06:16:27 compute-0 podman[203006]: 2025-11-25 06:16:27.02041323 +0000 UTC m=+0.044241327 container exec f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:16:27 compute-0 podman[203006]: 2025-11-25 06:16:27.022994989 +0000 UTC m=+0.046823086 container exec_died f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 25 06:16:27 compute-0 sudo[203003]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:27 compute-0 systemd[1]: libpod-conmon-f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54.scope: Deactivated successfully.
Nov 25 06:16:27 compute-0 sudo[203180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oofhhdsthkespfobpykqvscyuzygrdie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051387.1734974-867-277813161697756/AnsiballZ_file.py'
Nov 25 06:16:27 compute-0 sudo[203180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:27 compute-0 python3.9[203182]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:27 compute-0 sudo[203180]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:27 compute-0 sudo[203332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilahbmgaboiqdbkrimccnixkyzvczydj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051387.67839-876-120498609718692/AnsiballZ_podman_container_info.py'
Nov 25 06:16:27 compute-0 sudo[203332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:28 compute-0 python3.9[203334]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 25 06:16:28 compute-0 sudo[203332]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:28 compute-0 sudo[203494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pubgwchmkarfqerekflxkoccfijtnjiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051388.191935-884-172305592098708/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:28 compute-0 sudo[203494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:28 compute-0 python3.9[203496]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:28 compute-0 systemd[1]: Started libpod-conmon-62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62.scope.
Nov 25 06:16:28 compute-0 podman[203497]: 2025-11-25 06:16:28.594444179 +0000 UTC m=+0.044258681 container exec 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Nov 25 06:16:28 compute-0 podman[203513]: 2025-11-25 06:16:28.648501731 +0000 UTC m=+0.045698839 container exec_died 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent)
Nov 25 06:16:28 compute-0 podman[203497]: 2025-11-25 06:16:28.652685222 +0000 UTC m=+0.102499704 container exec_died 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 25 06:16:28 compute-0 systemd[1]: libpod-conmon-62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62.scope: Deactivated successfully.
Nov 25 06:16:28 compute-0 sudo[203494]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:28 compute-0 sudo[203682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hatlueuaisaondnbtjzxwcaunddrxufi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051388.7924497-892-166884693328284/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:28 compute-0 sudo[203682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:28 compute-0 podman[203646]: 2025-11-25 06:16:28.995928068 +0000 UTC m=+0.042747770 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 06:16:28 compute-0 systemd[1]: d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544-6f546720d73c7e04.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 06:16:28 compute-0 systemd[1]: d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544-6f546720d73c7e04.service: Failed with result 'exit-code'.
Nov 25 06:16:29 compute-0 python3.9[203690]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:29 compute-0 systemd[1]: Started libpod-conmon-62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62.scope.
Nov 25 06:16:29 compute-0 podman[203691]: 2025-11-25 06:16:29.225665936 +0000 UTC m=+0.055726573 container exec 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 06:16:29 compute-0 podman[203691]: 2025-11-25 06:16:29.233520255 +0000 UTC m=+0.063580871 container exec_died 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 06:16:29 compute-0 sudo[203682]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:29 compute-0 systemd[1]: libpod-conmon-62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62.scope: Deactivated successfully.
Nov 25 06:16:29 compute-0 sudo[203865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyehyyxvxexttkxdlbalxdkttjrbizuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051389.3731315-900-261446594836227/AnsiballZ_file.py'
Nov 25 06:16:29 compute-0 sudo[203865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:29 compute-0 python3.9[203867]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:29 compute-0 sudo[203865]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:30 compute-0 sudo[204017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikhgpxvrottocwctlddgjmbltrhsuibo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051389.8637066-909-11019704191865/AnsiballZ_podman_container_info.py'
Nov 25 06:16:30 compute-0 sudo[204017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:30 compute-0 python3.9[204019]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 25 06:16:30 compute-0 sudo[204017]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:30 compute-0 sudo[204179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdqufhawvoyswjkzvfvagbnwfbefgcsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051390.3674822-917-202961167874541/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:30 compute-0 sudo[204179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:30 compute-0 python3.9[204181]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:30 compute-0 systemd[1]: Started libpod-conmon-36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd.scope.
Nov 25 06:16:30 compute-0 podman[204182]: 2025-11-25 06:16:30.777488076 +0000 UTC m=+0.047063648 container exec 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 06:16:30 compute-0 podman[204198]: 2025-11-25 06:16:30.831469658 +0000 UTC m=+0.044378907 container exec_died 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 25 06:16:30 compute-0 podman[204182]: 2025-11-25 06:16:30.835100832 +0000 UTC m=+0.104676404 container exec_died 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 06:16:30 compute-0 systemd[1]: libpod-conmon-36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd.scope: Deactivated successfully.
Nov 25 06:16:30 compute-0 sudo[204179]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:31 compute-0 sudo[204357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbrugrrwkbjbnbndhpnnjhuassyxwsuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051390.978019-925-98232473655073/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:31 compute-0 sudo[204357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:31 compute-0 python3.9[204359]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:31 compute-0 systemd[1]: Started libpod-conmon-36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd.scope.
Nov 25 06:16:31 compute-0 podman[204360]: 2025-11-25 06:16:31.378710074 +0000 UTC m=+0.049927605 container exec 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:16:31 compute-0 podman[204375]: 2025-11-25 06:16:31.430473519 +0000 UTC m=+0.043384343 container exec_died 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 25 06:16:31 compute-0 podman[204360]: 2025-11-25 06:16:31.432754233 +0000 UTC m=+0.103971775 container exec_died 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 06:16:31 compute-0 systemd[1]: libpod-conmon-36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd.scope: Deactivated successfully.
Nov 25 06:16:31 compute-0 sudo[204357]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:31 compute-0 sudo[204534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onmjvjdwhvevxcsjgjilkxoonpbymggs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051391.5782125-933-51681017000956/AnsiballZ_file.py'
Nov 25 06:16:31 compute-0 sudo[204534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:31 compute-0 python3.9[204536]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:31 compute-0 sudo[204534]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:32 compute-0 sudo[204686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unnqbahsxuwnhnrkdmmsyzuwpubirmwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051392.0747244-942-104689933742507/AnsiballZ_podman_container_info.py'
Nov 25 06:16:32 compute-0 sudo[204686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:32 compute-0 python3.9[204688]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 25 06:16:32 compute-0 sudo[204686]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:32 compute-0 sudo[204847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kflmtmjeldielmjtxcsmzbhngubaptod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051392.5867133-950-258363044443114/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:32 compute-0 sudo[204847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:32 compute-0 python3.9[204849]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:32 compute-0 systemd[1]: Started libpod-conmon-d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544.scope.
Nov 25 06:16:32 compute-0 podman[204850]: 2025-11-25 06:16:32.978614049 +0000 UTC m=+0.042888103 container exec d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 06:16:32 compute-0 podman[204850]: 2025-11-25 06:16:32.984561405 +0000 UTC m=+0.048835459 container exec_died d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 25 06:16:33 compute-0 sudo[204847]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:33 compute-0 systemd[1]: libpod-conmon-d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544.scope: Deactivated successfully.
Nov 25 06:16:33 compute-0 sudo[205025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boiisshhdcnsdklaxwgrfzrlewvtzpax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051393.133204-958-234261240551275/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:33 compute-0 sudo[205025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:33 compute-0 python3.9[205027]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:33 compute-0 systemd[1]: Started libpod-conmon-d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544.scope.
Nov 25 06:16:33 compute-0 podman[205028]: 2025-11-25 06:16:33.524523161 +0000 UTC m=+0.046871789 container exec d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 06:16:33 compute-0 podman[205043]: 2025-11-25 06:16:33.57747914 +0000 UTC m=+0.043819468 container exec_died d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 25 06:16:33 compute-0 podman[205028]: 2025-11-25 06:16:33.580553091 +0000 UTC m=+0.102901718 container exec_died d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:16:33 compute-0 systemd[1]: libpod-conmon-d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544.scope: Deactivated successfully.
Nov 25 06:16:33 compute-0 sudo[205025]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:33 compute-0 sudo[205203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqnskputguotnnkgyfnehtujyvfclqyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051393.7092774-966-260651679559011/AnsiballZ_file.py'
Nov 25 06:16:33 compute-0 sudo[205203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:34 compute-0 python3.9[205205]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:34 compute-0 sudo[205203]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:34 compute-0 sudo[205355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abzqkmbpwfmomclmnwbcnyncrzfkbgio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051394.1914005-975-275569009579034/AnsiballZ_podman_container_info.py'
Nov 25 06:16:34 compute-0 sudo[205355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:34 compute-0 python3.9[205357]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 25 06:16:34 compute-0 sudo[205355]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:34 compute-0 sudo[205516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckjfxhkceoizgabgieulhmxvgudsahmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051394.6774573-983-69161743993664/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:34 compute-0 sudo[205516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:35 compute-0 python3.9[205518]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:35 compute-0 systemd[1]: Started libpod-conmon-0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3.scope.
Nov 25 06:16:35 compute-0 podman[205519]: 2025-11-25 06:16:35.064472872 +0000 UTC m=+0.042501839 container exec 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:16:35 compute-0 podman[205535]: 2025-11-25 06:16:35.115523831 +0000 UTC m=+0.043019730 container exec_died 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:16:35 compute-0 podman[205519]: 2025-11-25 06:16:35.118271761 +0000 UTC m=+0.096300718 container exec_died 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:16:35 compute-0 systemd[1]: libpod-conmon-0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3.scope: Deactivated successfully.
Nov 25 06:16:35 compute-0 sudo[205516]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:35 compute-0 podman[205544]: 2025-11-25 06:16:35.18989891 +0000 UTC m=+0.044618446 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:16:35 compute-0 sudo[205714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guowtfldiztfgztrdnvyxuqkiceoujjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051395.256984-991-86584169574825/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:35 compute-0 sudo[205714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:35 compute-0 python3.9[205716]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:35 compute-0 systemd[1]: Started libpod-conmon-0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3.scope.
Nov 25 06:16:35 compute-0 podman[205717]: 2025-11-25 06:16:35.656125808 +0000 UTC m=+0.045762359 container exec 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:16:35 compute-0 podman[205733]: 2025-11-25 06:16:35.708447057 +0000 UTC m=+0.043388259 container exec_died 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:16:35 compute-0 podman[205717]: 2025-11-25 06:16:35.71228555 +0000 UTC m=+0.101922091 container exec_died 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:16:35 compute-0 systemd[1]: libpod-conmon-0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3.scope: Deactivated successfully.
Nov 25 06:16:35 compute-0 sudo[205714]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:36 compute-0 sudo[205892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cceqvuwhhprdngnjwnzffeuqxxzaphip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051395.8430283-999-277401250324083/AnsiballZ_file.py'
Nov 25 06:16:36 compute-0 sudo[205892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:36 compute-0 python3.9[205894]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:36 compute-0 sudo[205892]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:36 compute-0 sudo[206044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwtdqcmkftdkuxaqwtbmdfuubfoyjjmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051396.3301976-1008-182989246532508/AnsiballZ_podman_container_info.py'
Nov 25 06:16:36 compute-0 sudo[206044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:36 compute-0 python3.9[206046]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 25 06:16:36 compute-0 sudo[206044]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:36 compute-0 sudo[206205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xycrgxrztvsikdfmghznqvzbzjjayftn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051396.8047478-1016-62708200853731/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:36 compute-0 sudo[206205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:37 compute-0 python3.9[206207]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:37 compute-0 systemd[1]: Started libpod-conmon-834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226.scope.
Nov 25 06:16:37 compute-0 podman[206208]: 2025-11-25 06:16:37.186616252 +0000 UTC m=+0.044055701 container exec 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:16:37 compute-0 podman[206224]: 2025-11-25 06:16:37.246504252 +0000 UTC m=+0.050995937 container exec_died 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:16:37 compute-0 podman[206208]: 2025-11-25 06:16:37.249375954 +0000 UTC m=+0.106815392 container exec_died 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 06:16:37 compute-0 systemd[1]: libpod-conmon-834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226.scope: Deactivated successfully.
Nov 25 06:16:37 compute-0 sudo[206205]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:37 compute-0 sudo[206383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clvxklxaydkpyzlsyhkojvpfjmibefcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051397.3819726-1024-131256165744972/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:37 compute-0 sudo[206383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:37 compute-0 python3.9[206385]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:37 compute-0 systemd[1]: Started libpod-conmon-834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226.scope.
Nov 25 06:16:37 compute-0 podman[206386]: 2025-11-25 06:16:37.771364279 +0000 UTC m=+0.045480522 container exec 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 06:16:37 compute-0 podman[206402]: 2025-11-25 06:16:37.82351609 +0000 UTC m=+0.043352763 container exec_died 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:16:37 compute-0 podman[206386]: 2025-11-25 06:16:37.826345442 +0000 UTC m=+0.100461665 container exec_died 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 06:16:37 compute-0 systemd[1]: libpod-conmon-834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226.scope: Deactivated successfully.
Nov 25 06:16:37 compute-0 sudo[206383]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:38 compute-0 sudo[206560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpqwdwfynixzckvkmzkcsliywlgdvver ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051397.9716053-1032-147293417323791/AnsiballZ_file.py'
Nov 25 06:16:38 compute-0 sudo[206560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:38 compute-0 python3.9[206562]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:38 compute-0 sudo[206560]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:38 compute-0 sudo[206712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etgzgpizkguzhkjzwoggxchfxvexzijt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051398.4638493-1041-46135384403324/AnsiballZ_podman_container_info.py'
Nov 25 06:16:38 compute-0 sudo[206712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:38 compute-0 python3.9[206714]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 25 06:16:38 compute-0 sudo[206712]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:39 compute-0 sudo[206875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtcrsqqlbzjljggsofmxmdipnpmnxaik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051398.9628994-1049-105929432182611/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:39 compute-0 sudo[206875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:39 compute-0 python3.9[206877]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:39 compute-0 systemd[1]: Started libpod-conmon-f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d.scope.
Nov 25 06:16:39 compute-0 podman[206878]: 2025-11-25 06:16:39.349154883 +0000 UTC m=+0.041980272 container exec f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git)
Nov 25 06:16:39 compute-0 podman[206894]: 2025-11-25 06:16:39.403525623 +0000 UTC m=+0.044410946 container exec_died f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible)
Nov 25 06:16:39 compute-0 podman[206878]: 2025-11-25 06:16:39.406425036 +0000 UTC m=+0.099250406 container exec_died f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest 
release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm)
Nov 25 06:16:39 compute-0 systemd[1]: libpod-conmon-f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d.scope: Deactivated successfully.
Nov 25 06:16:39 compute-0 sudo[206875]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:39 compute-0 sudo[207053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiplblqpofmpinxpenqtcglpdyqwazcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051399.5517926-1057-42758373635216/AnsiballZ_podman_container_exec.py'
Nov 25 06:16:39 compute-0 sudo[207053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:39 compute-0 python3.9[207055]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 06:16:39 compute-0 systemd[1]: Started libpod-conmon-f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d.scope.
Nov 25 06:16:39 compute-0 podman[207056]: 2025-11-25 06:16:39.947063011 +0000 UTC m=+0.040534803 container exec f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Nov 25 06:16:39 compute-0 podman[207072]: 2025-11-25 06:16:39.996544509 +0000 UTC m=+0.040449733 container exec_died f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 25 06:16:39 compute-0 podman[207056]: 2025-11-25 06:16:39.999006011 +0000 UTC m=+0.092477804 container exec_died f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350)
Nov 25 06:16:40 compute-0 systemd[1]: libpod-conmon-f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d.scope: Deactivated successfully.
Nov 25 06:16:40 compute-0 sudo[207053]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:40 compute-0 sudo[207231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udhdodttvwvnnerldlkwwreauogypulc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051400.141756-1065-215670214000633/AnsiballZ_file.py'
Nov 25 06:16:40 compute-0 sudo[207231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:40 compute-0 python3.9[207233]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:40 compute-0 sudo[207231]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:40 compute-0 sudo[207383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipwwkrelrendpnorwjbmnjdaxymaemdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051400.643795-1074-709032103041/AnsiballZ_file.py'
Nov 25 06:16:40 compute-0 sudo[207383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:40 compute-0 python3.9[207385]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:40 compute-0 sudo[207383]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:41 compute-0 sudo[207544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lncaaqlzffticiwpykqbbynfapukoxvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051401.1018279-1082-247534255801885/AnsiballZ_stat.py'
Nov 25 06:16:41 compute-0 sudo[207544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:41 compute-0 podman[207509]: 2025-11-25 06:16:41.330902812 +0000 UTC m=+0.063279918 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true)
Nov 25 06:16:41 compute-0 python3.9[207552]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:16:41 compute-0 sudo[207544]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:41 compute-0 sudo[207682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vscsewtiiqbyyixbotczymtsrvzgfrzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051401.1018279-1082-247534255801885/AnsiballZ_copy.py'
Nov 25 06:16:41 compute-0 sudo[207682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:41 compute-0 python3.9[207684]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764051401.1018279-1082-247534255801885/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:41 compute-0 sudo[207682]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:42 compute-0 sudo[207834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jysfpcwrsrtxrmhbbnwctlopgrijdkne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051402.011759-1098-199511554540922/AnsiballZ_file.py'
Nov 25 06:16:42 compute-0 sudo[207834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:42 compute-0 python3.9[207836]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:42 compute-0 sudo[207834]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:42 compute-0 sudo[207986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilhhktzcwupulqmwfxwlllmuxqxbjzmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051402.4459858-1106-211870995947970/AnsiballZ_stat.py'
Nov 25 06:16:42 compute-0 sudo[207986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:42 compute-0 python3.9[207988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:16:42 compute-0 sudo[207986]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:42 compute-0 sudo[208064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itjxcjgwaguiwwjwcmovrpemtzrmcrgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051402.4459858-1106-211870995947970/AnsiballZ_file.py'
Nov 25 06:16:42 compute-0 sudo[208064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:43 compute-0 python3.9[208066]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:43 compute-0 sudo[208064]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:43 compute-0 sudo[208216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydrsgorzbbfwelsgazmgptpekuewejzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051403.2132504-1118-96169747253754/AnsiballZ_stat.py'
Nov 25 06:16:43 compute-0 sudo[208216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:43 compute-0 python3.9[208218]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:16:43 compute-0 sudo[208216]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:43 compute-0 sudo[208294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtgcbychaqlsjoazodkomzgryebqxywb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051403.2132504-1118-96169747253754/AnsiballZ_file.py'
Nov 25 06:16:43 compute-0 sudo[208294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:43 compute-0 python3.9[208296]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.bduxnbk0 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:43 compute-0 sudo[208294]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:44 compute-0 sudo[208446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdzvvtxevmvmcgntpetftshofurvscue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051403.964486-1130-210854434439271/AnsiballZ_stat.py'
Nov 25 06:16:44 compute-0 sudo[208446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:44 compute-0 python3.9[208448]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:16:44 compute-0 sudo[208446]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:44 compute-0 sudo[208524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roucwzgcgvzbksrwlzatpmuxrsgfyojc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051403.964486-1130-210854434439271/AnsiballZ_file.py'
Nov 25 06:16:44 compute-0 sudo[208524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:44 compute-0 python3.9[208526]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:44 compute-0 sudo[208524]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:44 compute-0 sudo[208676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enccaehkfenhmmclfnteszxkmegehpzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051404.7530744-1143-155673464097080/AnsiballZ_command.py'
Nov 25 06:16:44 compute-0 sudo[208676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:45 compute-0 python3.9[208678]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:16:45 compute-0 sudo[208676]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:45 compute-0 sudo[208855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrbnnkdyxwwhqnmhvosuoahatjrobtfe ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764051405.2077484-1151-192651716381868/AnsiballZ_edpm_nftables_from_files.py'
Nov 25 06:16:45 compute-0 podman[208804]: 2025-11-25 06:16:45.561360667 +0000 UTC m=+0.040091323 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:16:45 compute-0 sudo[208855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:45 compute-0 podman[208803]: 2025-11-25 06:16:45.598971679 +0000 UTC m=+0.079234125 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 25 06:16:45 compute-0 python3[208869]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 25 06:16:45 compute-0 sudo[208855]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:46 compute-0 sudo[209019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvdwtkbbltiweuulfybggymzwtxagzve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051405.8644593-1159-252098914974566/AnsiballZ_stat.py'
Nov 25 06:16:46 compute-0 sudo[209019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:46 compute-0 python3.9[209021]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:16:46 compute-0 sudo[209019]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:46 compute-0 sudo[209097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ichxntnhjhzwuljxthoybuibegliockf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051405.8644593-1159-252098914974566/AnsiballZ_file.py'
Nov 25 06:16:46 compute-0 sudo[209097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:46 compute-0 python3.9[209099]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:46 compute-0 sudo[209097]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:46 compute-0 sudo[209249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynseccyykcpvreziaiwxugcyqeezqtto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051406.6483164-1171-75349527276313/AnsiballZ_stat.py'
Nov 25 06:16:46 compute-0 sudo[209249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:46 compute-0 python3.9[209251]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:16:47 compute-0 sudo[209249]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:47 compute-0 sudo[209327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxmuuqwjzaofbidimneyknwhymrweuzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051406.6483164-1171-75349527276313/AnsiballZ_file.py'
Nov 25 06:16:47 compute-0 sudo[209327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:16:47.188 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:16:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:16:47.189 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:16:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:16:47.189 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:16:47 compute-0 python3.9[209329]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:47 compute-0 sudo[209327]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:47 compute-0 sudo[209480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psoxmulouufnzzlukmjhdaerzvioooqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051407.4379358-1183-196156732419929/AnsiballZ_stat.py'
Nov 25 06:16:47 compute-0 sudo[209480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:47 compute-0 python3.9[209482]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:16:47 compute-0 sudo[209480]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:47 compute-0 sudo[209558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abucflylsfimqbloolfsqxuofduhebpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051407.4379358-1183-196156732419929/AnsiballZ_file.py'
Nov 25 06:16:47 compute-0 sudo[209558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:48 compute-0 python3.9[209560]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:48 compute-0 sudo[209558]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:48 compute-0 sudo[209710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foqwndwjrogjtfwxsufxnewgzuhuxkgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051408.2177303-1195-120931146650153/AnsiballZ_stat.py'
Nov 25 06:16:48 compute-0 sudo[209710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:48 compute-0 python3.9[209712]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:16:48 compute-0 sudo[209710]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:48 compute-0 sudo[209788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlgbulvggzfppkkksitijbdfnjjnkvsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051408.2177303-1195-120931146650153/AnsiballZ_file.py'
Nov 25 06:16:48 compute-0 sudo[209788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:48 compute-0 python3.9[209790]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:48 compute-0 sudo[209788]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:49 compute-0 sudo[209940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cozhwmyxwkgyeiqtazoblxugwsqlfnhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051408.9900224-1207-168011090245417/AnsiballZ_stat.py'
Nov 25 06:16:49 compute-0 sudo[209940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:49 compute-0 python3.9[209942]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 06:16:49 compute-0 sudo[209940]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:49 compute-0 sudo[210065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iojmrifrwmkaefilexjspfrgdswrsxsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051408.9900224-1207-168011090245417/AnsiballZ_copy.py'
Nov 25 06:16:49 compute-0 sudo[210065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:49 compute-0 python3.9[210067]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764051408.9900224-1207-168011090245417/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:49 compute-0 sudo[210065]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:50 compute-0 sudo[210217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkhvjmejartwprsdiyshlwstoutadiiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051409.888324-1222-168652343823135/AnsiballZ_file.py'
Nov 25 06:16:50 compute-0 sudo[210217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:50 compute-0 python3.9[210219]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:50 compute-0 sudo[210217]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:50 compute-0 sudo[210369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfhreqolztdzlzgqxvlqkchfuybehhyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051410.3419094-1230-204392771839536/AnsiballZ_command.py'
Nov 25 06:16:50 compute-0 sudo[210369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:50 compute-0 python3.9[210371]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:16:50 compute-0 sudo[210369]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:51 compute-0 sudo[210524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjenowydsthnzcnwnpvdkobqpnlmydqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051410.803503-1238-163131287154588/AnsiballZ_blockinfile.py'
Nov 25 06:16:51 compute-0 sudo[210524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:51 compute-0 python3.9[210526]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:51 compute-0 sudo[210524]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:51 compute-0 sudo[210686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncfvsfzotpnecpyadaehyzegdyqtoosj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051411.4715886-1247-70297836460151/AnsiballZ_command.py'
Nov 25 06:16:51 compute-0 sudo[210686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:51 compute-0 podman[210650]: 2025-11-25 06:16:51.656983423 +0000 UTC m=+0.038940785 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 06:16:51 compute-0 python3.9[210694]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:16:51 compute-0 sudo[210686]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:52 compute-0 sudo[210845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajwhkhtssnhdkkmkyfptmmasmfwsyeyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051411.9398928-1255-61254842450157/AnsiballZ_stat.py'
Nov 25 06:16:52 compute-0 sudo[210845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:52 compute-0 python3.9[210847]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 06:16:52 compute-0 sudo[210845]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:52 compute-0 sudo[210999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkdbydmfbcexymsypflqwtwtmvklrvqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051412.3971584-1263-232502048955126/AnsiballZ_command.py'
Nov 25 06:16:52 compute-0 sudo[210999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:52 compute-0 python3.9[211001]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 06:16:52 compute-0 sudo[210999]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:53 compute-0 sudo[211154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezffmqzjxqwnefjjdornjuoeqneqneoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764051412.8882964-1271-105146933908527/AnsiballZ_file.py'
Nov 25 06:16:53 compute-0 sudo[211154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:16:53 compute-0 python3.9[211156]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 06:16:53 compute-0 sudo[211154]: pam_unix(sudo:session): session closed for user root
Nov 25 06:16:53 compute-0 sshd-session[186608]: Connection closed by 192.168.122.30 port 38384
Nov 25 06:16:53 compute-0 sshd-session[186605]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:16:53 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Nov 25 06:16:53 compute-0 systemd[1]: session-25.scope: Consumed 1min 7.872s CPU time.
Nov 25 06:16:53 compute-0 systemd-logind[744]: Session 25 logged out. Waiting for processes to exit.
Nov 25 06:16:53 compute-0 systemd-logind[744]: Removed session 25.
Nov 25 06:16:55 compute-0 podman[211181]: 2025-11-25 06:16:55.061933329 +0000 UTC m=+0.040438463 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, vcs-type=git, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Nov 25 06:16:59 compute-0 podman[211199]: 2025-11-25 06:16:59.063092148 +0000 UTC m=+0.043243610 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, tcib_managed=true, managed_by=edpm_ansible)
Nov 25 06:17:06 compute-0 podman[211216]: 2025-11-25 06:17:06.081884783 +0000 UTC m=+0.061542110 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:17:12 compute-0 podman[211237]: 2025-11-25 06:17:12.081970925 +0000 UTC m=+0.057031760 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 06:17:16 compute-0 podman[211259]: 2025-11-25 06:17:16.062754361 +0000 UTC m=+0.041210250 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 25 06:17:16 compute-0 podman[211260]: 2025-11-25 06:17:16.063042069 +0000 UTC m=+0.039971257 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 06:17:18 compute-0 nova_compute[186241]: 2025-11-25 06:17:18.196 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:17:18 compute-0 nova_compute[186241]: 2025-11-25 06:17:18.196 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:17:18 compute-0 nova_compute[186241]: 2025-11-25 06:17:18.704 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:17:18 compute-0 nova_compute[186241]: 2025-11-25 06:17:18.704 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:17:18 compute-0 nova_compute[186241]: 2025-11-25 06:17:18.705 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:17:18 compute-0 nova_compute[186241]: 2025-11-25 06:17:18.705 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:17:18 compute-0 nova_compute[186241]: 2025-11-25 06:17:18.705 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:17:18 compute-0 nova_compute[186241]: 2025-11-25 06:17:18.705 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:17:18 compute-0 nova_compute[186241]: 2025-11-25 06:17:18.705 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:17:18 compute-0 nova_compute[186241]: 2025-11-25 06:17:18.705 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:17:19 compute-0 nova_compute[186241]: 2025-11-25 06:17:19.216 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:17:19 compute-0 nova_compute[186241]: 2025-11-25 06:17:19.216 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:17:19 compute-0 nova_compute[186241]: 2025-11-25 06:17:19.216 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:17:19 compute-0 nova_compute[186241]: 2025-11-25 06:17:19.216 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:17:19 compute-0 nova_compute[186241]: 2025-11-25 06:17:19.396 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:17:19 compute-0 nova_compute[186241]: 2025-11-25 06:17:19.398 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6064MB free_disk=73.05401992797852GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:17:19 compute-0 nova_compute[186241]: 2025-11-25 06:17:19.398 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:17:19 compute-0 nova_compute[186241]: 2025-11-25 06:17:19.398 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:17:20 compute-0 nova_compute[186241]: 2025-11-25 06:17:20.433 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:17:20 compute-0 nova_compute[186241]: 2025-11-25 06:17:20.434 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:17:20 compute-0 nova_compute[186241]: 2025-11-25 06:17:20.450 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:17:20 compute-0 nova_compute[186241]: 2025-11-25 06:17:20.953 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:17:20 compute-0 nova_compute[186241]: 2025-11-25 06:17:20.954 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:17:20 compute-0 nova_compute[186241]: 2025-11-25 06:17:20.955 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.556s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:17:22 compute-0 podman[211297]: 2025-11-25 06:17:22.083056267 +0000 UTC m=+0.062321461 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:17:25 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:25.007 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:17:25 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:25.008 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:17:25 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:25.010 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:17:25 compute-0 podman[211314]: 2025-11-25 06:17:25.570851144 +0000 UTC m=+0.041478431 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 25 06:17:30 compute-0 podman[211333]: 2025-11-25 06:17:30.058056313 +0000 UTC m=+0.037725689 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 25 06:17:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:30.099 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:a3:43 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-57c5a55b-f0e5-4312-b2f1-a38530c8c9f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57c5a55b-f0e5-4312-b2f1-a38530c8c9f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '569b0ed2b3cc4372897b86d284219992', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b0aab03-58b1-4261-966f-21fad50c4053, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3325e9f8-11fb-402b-b4b7-e97c61597f68) old=Port_Binding(mac=['fa:16:3e:62:a3:43'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-57c5a55b-f0e5-4312-b2f1-a38530c8c9f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57c5a55b-f0e5-4312-b2f1-a38530c8c9f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '569b0ed2b3cc4372897b86d284219992', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:17:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:30.100 103953 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3325e9f8-11fb-402b-b4b7-e97c61597f68 in datapath 57c5a55b-f0e5-4312-b2f1-a38530c8c9f1 updated
Nov 25 06:17:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:30.101 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 57c5a55b-f0e5-4312-b2f1-a38530c8c9f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:17:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:30.102 103953 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpe238uiyp/privsep.sock']
Nov 25 06:17:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:30.612 103953 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 25 06:17:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:30.613 103953 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpe238uiyp/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:366
Nov 25 06:17:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:30.538 211354 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 25 06:17:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:30.541 211354 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 25 06:17:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:30.543 211354 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 25 06:17:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:30.543 211354 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211354
Nov 25 06:17:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:30.614 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[3e92753c-09f9-42d8-a119-9faa836e93a0]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:17:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:31.009 211354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:17:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:31.009 211354 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:17:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:31.010 211354 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:17:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:31.356 211354 INFO oslo_service.backend [-] Loading backend: eventlet
Nov 25 06:17:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:31.360 211354 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Nov 25 06:17:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:31.391 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7beca5a4-39d3-4733-a60e-911bc0acd94b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:17:37 compute-0 podman[211361]: 2025-11-25 06:17:37.054951302 +0000 UTC m=+0.033902154 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:17:43 compute-0 podman[211382]: 2025-11-25 06:17:43.105850308 +0000 UTC m=+0.084937003 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 25 06:17:47 compute-0 podman[211406]: 2025-11-25 06:17:47.064973723 +0000 UTC m=+0.038367681 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:17:47 compute-0 podman[211405]: 2025-11-25 06:17:47.06752859 +0000 UTC m=+0.042005659 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0)
Nov 25 06:17:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:47.218 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:17:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:47.219 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:17:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:17:47.219 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:17:53 compute-0 podman[211444]: 2025-11-25 06:17:53.058904099 +0000 UTC m=+0.037804607 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 25 06:17:56 compute-0 podman[211460]: 2025-11-25 06:17:56.061970054 +0000 UTC m=+0.041980441 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.548 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.549 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.549 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.549 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.549 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.549 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.549 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.550 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.551 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:17:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:17:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:18:01 compute-0 podman[211478]: 2025-11-25 06:18:01.08790002 +0000 UTC m=+0.067071723 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:18:08 compute-0 podman[211496]: 2025-11-25 06:18:08.054059591 +0000 UTC m=+0.034552432 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:18:14 compute-0 podman[211517]: 2025-11-25 06:18:14.07699473 +0000 UTC m=+0.056249564 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 25 06:18:18 compute-0 podman[211541]: 2025-11-25 06:18:18.059956794 +0000 UTC m=+0.035937702 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:18:18 compute-0 podman[211540]: 2025-11-25 06:18:18.063372042 +0000 UTC m=+0.042107111 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 06:18:20 compute-0 nova_compute[186241]: 2025-11-25 06:18:20.957 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:18:20 compute-0 nova_compute[186241]: 2025-11-25 06:18:20.957 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:18:20 compute-0 nova_compute[186241]: 2025-11-25 06:18:20.957 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:18:20 compute-0 nova_compute[186241]: 2025-11-25 06:18:20.957 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:18:20 compute-0 nova_compute[186241]: 2025-11-25 06:18:20.957 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:18:20 compute-0 nova_compute[186241]: 2025-11-25 06:18:20.957 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:18:20 compute-0 nova_compute[186241]: 2025-11-25 06:18:20.958 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:18:20 compute-0 nova_compute[186241]: 2025-11-25 06:18:20.958 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:18:20 compute-0 nova_compute[186241]: 2025-11-25 06:18:20.958 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:18:21 compute-0 nova_compute[186241]: 2025-11-25 06:18:21.479 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:18:21 compute-0 nova_compute[186241]: 2025-11-25 06:18:21.479 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:18:21 compute-0 nova_compute[186241]: 2025-11-25 06:18:21.479 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:18:21 compute-0 nova_compute[186241]: 2025-11-25 06:18:21.480 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:18:21 compute-0 nova_compute[186241]: 2025-11-25 06:18:21.657 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:18:21 compute-0 nova_compute[186241]: 2025-11-25 06:18:21.657 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6002MB free_disk=73.05401992797852GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:18:21 compute-0 nova_compute[186241]: 2025-11-25 06:18:21.658 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:18:21 compute-0 nova_compute[186241]: 2025-11-25 06:18:21.658 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:18:22 compute-0 nova_compute[186241]: 2025-11-25 06:18:22.698 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:18:22 compute-0 nova_compute[186241]: 2025-11-25 06:18:22.698 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:18:22 compute-0 nova_compute[186241]: 2025-11-25 06:18:22.735 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:18:23 compute-0 nova_compute[186241]: 2025-11-25 06:18:23.239 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:18:23 compute-0 nova_compute[186241]: 2025-11-25 06:18:23.241 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:18:23 compute-0 nova_compute[186241]: 2025-11-25 06:18:23.241 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:18:24 compute-0 podman[211578]: 2025-11-25 06:18:24.079899529 +0000 UTC m=+0.059470356 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, io.buildah.version=1.41.3)
Nov 25 06:18:27 compute-0 podman[211594]: 2025-11-25 06:18:27.058972066 +0000 UTC m=+0.039333003 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm)
Nov 25 06:18:28 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:18:28.118 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:18:28 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:18:28.119 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:18:28 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:18:28.121 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:18:32 compute-0 podman[211614]: 2025-11-25 06:18:32.060908702 +0000 UTC m=+0.039755137 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 25 06:18:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:18:36.189 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:af:8b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e4c5e99-aead-49a3-910e-5959edf0d03a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6fb2701-3644-4fc1-81d0-634fa89abf62, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=41d1b45c-dacf-4079-b06f-ab644147f8e7) old=Port_Binding(mac=['fa:16:3e:78:af:8b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e4c5e99-aead-49a3-910e-5959edf0d03a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:18:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:18:36.190 103953 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 41d1b45c-dacf-4079-b06f-ab644147f8e7 in datapath 0e4c5e99-aead-49a3-910e-5959edf0d03a updated
Nov 25 06:18:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:18:36.191 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0e4c5e99-aead-49a3-910e-5959edf0d03a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:18:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:18:36.192 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[b97807dd-f0e8-4494-b29e-47657c75bc3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:18:39 compute-0 podman[211631]: 2025-11-25 06:18:39.086905618 +0000 UTC m=+0.066769596 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 25 06:18:45 compute-0 podman[211652]: 2025-11-25 06:18:45.076055021 +0000 UTC m=+0.055134882 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 06:18:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:18:47.279 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:18:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:18:47.280 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:18:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:18:47.280 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:18:49 compute-0 podman[211678]: 2025-11-25 06:18:49.071922992 +0000 UTC m=+0.048353849 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:18:49 compute-0 podman[211677]: 2025-11-25 06:18:49.088868638 +0000 UTC m=+0.068016728 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 25 06:18:51 compute-0 nova_compute[186241]: 2025-11-25 06:18:51.460 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:18:51 compute-0 nova_compute[186241]: 2025-11-25 06:18:51.461 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:18:51 compute-0 nova_compute[186241]: 2025-11-25 06:18:51.963 186245 DEBUG nova.compute.manager [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:18:52 compute-0 nova_compute[186241]: 2025-11-25 06:18:52.491 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:18:52 compute-0 nova_compute[186241]: 2025-11-25 06:18:52.491 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:18:52 compute-0 nova_compute[186241]: 2025-11-25 06:18:52.495 186245 DEBUG nova.virt.hardware [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:18:52 compute-0 nova_compute[186241]: 2025-11-25 06:18:52.496 186245 INFO nova.compute.claims [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:18:53 compute-0 nova_compute[186241]: 2025-11-25 06:18:53.530 186245 DEBUG nova.compute.provider_tree [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:18:54 compute-0 nova_compute[186241]: 2025-11-25 06:18:54.034 186245 DEBUG nova.scheduler.client.report [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:18:54 compute-0 nova_compute[186241]: 2025-11-25 06:18:54.538 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.047s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:18:54 compute-0 nova_compute[186241]: 2025-11-25 06:18:54.539 186245 DEBUG nova.compute.manager [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:18:55 compute-0 nova_compute[186241]: 2025-11-25 06:18:55.044 186245 DEBUG nova.compute.manager [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:18:55 compute-0 nova_compute[186241]: 2025-11-25 06:18:55.044 186245 DEBUG nova.network.neutron [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:18:55 compute-0 podman[211716]: 2025-11-25 06:18:55.085949035 +0000 UTC m=+0.066234467 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:18:55 compute-0 nova_compute[186241]: 2025-11-25 06:18:55.550 186245 INFO nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:18:56 compute-0 nova_compute[186241]: 2025-11-25 06:18:56.054 186245 DEBUG nova.compute.manager [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:18:56 compute-0 nova_compute[186241]: 2025-11-25 06:18:56.115 186245 DEBUG nova.policy [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:18:57 compute-0 nova_compute[186241]: 2025-11-25 06:18:57.066 186245 DEBUG nova.compute.manager [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:18:57 compute-0 nova_compute[186241]: 2025-11-25 06:18:57.067 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:18:57 compute-0 nova_compute[186241]: 2025-11-25 06:18:57.067 186245 INFO nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Creating image(s)
Nov 25 06:18:57 compute-0 nova_compute[186241]: 2025-11-25 06:18:57.068 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:18:57 compute-0 nova_compute[186241]: 2025-11-25 06:18:57.068 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:18:57 compute-0 nova_compute[186241]: 2025-11-25 06:18:57.069 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:18:57 compute-0 nova_compute[186241]: 2025-11-25 06:18:57.069 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:18:57 compute-0 nova_compute[186241]: 2025-11-25 06:18:57.070 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:18:57 compute-0 nova_compute[186241]: 2025-11-25 06:18:57.457 186245 DEBUG nova.network.neutron [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Successfully created port: b0597686-1f09-4b3d-ad11-27d3fbbdde6c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:18:58 compute-0 podman[211732]: 2025-11-25 06:18:58.072329017 +0000 UTC m=+0.047300202 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 06:18:58 compute-0 nova_compute[186241]: 2025-11-25 06:18:58.559 186245 DEBUG nova.network.neutron [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Successfully updated port: b0597686-1f09-4b3d-ad11-27d3fbbdde6c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.025 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.029 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.029 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.063 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-4b7f1c44-c36c-4ce9-b498-4984df4111b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.064 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-4b7f1c44-c36c-4ce9-b498-4984df4111b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.065 186245 DEBUG nova.network.neutron [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.077 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be.part --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.078 186245 DEBUG nova.virt.images [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] 5215c26e-be2f-40b4-ac47-476bfa3cf3f2 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:278
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.079 186245 DEBUG nova.privsep.utils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.079 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be.part /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.140 186245 DEBUG nova.compute.manager [req-3c790620-c334-4cdf-a0f2-3e0eb5992472 req-e9734286-9c69-46dc-a1ae-a059179f9ec9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Received event network-changed-b0597686-1f09-4b3d-ad11-27d3fbbdde6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.141 186245 DEBUG nova.compute.manager [req-3c790620-c334-4cdf-a0f2-3e0eb5992472 req-e9734286-9c69-46dc-a1ae-a059179f9ec9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Refreshing instance network info cache due to event network-changed-b0597686-1f09-4b3d-ad11-27d3fbbdde6c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.141 186245 DEBUG oslo_concurrency.lockutils [req-3c790620-c334-4cdf-a0f2-3e0eb5992472 req-e9734286-9c69-46dc-a1ae-a059179f9ec9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-4b7f1c44-c36c-4ce9-b498-4984df4111b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.142 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be.part /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be.converted" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.145 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.190 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be.converted --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.191 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.192 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.195 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.196 186245 INFO oslo.privsep.daemon [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpwybmj4le/privsep.sock']
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.727 186245 INFO oslo.privsep.daemon [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Spawned new privsep daemon via rootwrap
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.648 211770 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.651 211770 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.653 211770 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.653 211770 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211770
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.790 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.846 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.847 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.848 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.848 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.852 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.852 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.895 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.896 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.924 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.925 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.925 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.968 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.969 186245 DEBUG nova.virt.disk.api [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:18:59 compute-0 nova_compute[186241]: 2025-11-25 06:18:59.969 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:19:00 compute-0 nova_compute[186241]: 2025-11-25 06:19:00.014 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:19:00 compute-0 nova_compute[186241]: 2025-11-25 06:19:00.014 186245 DEBUG nova.virt.disk.api [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:19:00 compute-0 nova_compute[186241]: 2025-11-25 06:19:00.015 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:19:00 compute-0 nova_compute[186241]: 2025-11-25 06:19:00.015 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Ensure instance console log exists: /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:19:00 compute-0 nova_compute[186241]: 2025-11-25 06:19:00.015 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:00 compute-0 nova_compute[186241]: 2025-11-25 06:19:00.016 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:00 compute-0 nova_compute[186241]: 2025-11-25 06:19:00.016 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:00 compute-0 nova_compute[186241]: 2025-11-25 06:19:00.120 186245 DEBUG nova.network.neutron [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.117 186245 DEBUG nova.network.neutron [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Updating instance_info_cache with network_info: [{"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.621 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-4b7f1c44-c36c-4ce9-b498-4984df4111b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.622 186245 DEBUG nova.compute.manager [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Instance network_info: |[{"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.622 186245 DEBUG oslo_concurrency.lockutils [req-3c790620-c334-4cdf-a0f2-3e0eb5992472 req-e9734286-9c69-46dc-a1ae-a059179f9ec9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-4b7f1c44-c36c-4ce9-b498-4984df4111b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.622 186245 DEBUG nova.network.neutron [req-3c790620-c334-4cdf-a0f2-3e0eb5992472 req-e9734286-9c69-46dc-a1ae-a059179f9ec9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Refreshing network info cache for port b0597686-1f09-4b3d-ad11-27d3fbbdde6c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.625 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Start _get_guest_xml network_info=[{"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.628 186245 WARNING nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.629 186245 DEBUG nova.virt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1724422885', uuid='4b7f1c44-c36c-4ce9-b498-4984df4111b3'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764051542.6291206) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.633 186245 DEBUG nova.virt.libvirt.host [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.633 186245 DEBUG nova.virt.libvirt.host [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.635 186245 DEBUG nova.virt.libvirt.host [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.636 186245 DEBUG nova.virt.libvirt.host [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.636 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.636 186245 DEBUG nova.virt.hardware [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.636 186245 DEBUG nova.virt.hardware [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.637 186245 DEBUG nova.virt.hardware [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.637 186245 DEBUG nova.virt.hardware [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.637 186245 DEBUG nova.virt.hardware [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.637 186245 DEBUG nova.virt.hardware [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.637 186245 DEBUG nova.virt.hardware [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.638 186245 DEBUG nova.virt.hardware [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.638 186245 DEBUG nova.virt.hardware [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.638 186245 DEBUG nova.virt.hardware [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.638 186245 DEBUG nova.virt.hardware [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.641 186245 DEBUG nova.privsep.utils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.642 186245 DEBUG nova.virt.libvirt.vif [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1724422885',display_name='tempest-TestNetworkBasicOps-server-1724422885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1724422885',id=1,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLw9bVqco8U0Wp4LUSfq3rP2Y3G4d9+HrSr18rZe33vIbLU5CgQl8aOPgkakXGAmX+BqLFeIZmSSNp4FTfanwb1Zj2Pr3fHYFwGKmsMT3gYm+uRVfIhQGs0huoytbWm/+w==',key_name='tempest-TestNetworkBasicOps-1956672502',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-otdipvg7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:18:56Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=4b7f1c44-c36c-4ce9-b498-4984df4111b3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.642 186245 DEBUG nova.network.os_vif_util [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.643 186245 DEBUG nova.network.os_vif_util [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:c9:8b,bridge_name='br-int',has_traffic_filtering=True,id=b0597686-1f09-4b3d-ad11-27d3fbbdde6c,network=Network(0e4c5e99-aead-49a3-910e-5959edf0d03a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0597686-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:19:02 compute-0 nova_compute[186241]: 2025-11-25 06:19:02.644 186245 DEBUG nova.objects.instance [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid 4b7f1c44-c36c-4ce9-b498-4984df4111b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:19:03 compute-0 podman[211787]: 2025-11-25 06:19:03.061895513 +0000 UTC m=+0.040297140 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.147 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:19:03 compute-0 nova_compute[186241]:   <uuid>4b7f1c44-c36c-4ce9-b498-4984df4111b3</uuid>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   <name>instance-00000001</name>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-1724422885</nova:name>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:19:02</nova:creationTime>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:19:03 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:19:03 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:19:03 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:19:03 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:19:03 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:19:03 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:19:03 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:19:03 compute-0 nova_compute[186241]:         <nova:port uuid="b0597686-1f09-4b3d-ad11-27d3fbbdde6c">
Nov 25 06:19:03 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <system>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <entry name="serial">4b7f1c44-c36c-4ce9-b498-4984df4111b3</entry>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <entry name="uuid">4b7f1c44-c36c-4ce9-b498-4984df4111b3</entry>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     </system>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   <os>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   </os>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   <features>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   </features>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.config"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:cd:c9:8b"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <target dev="tapb0597686-1f"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/console.log" append="off"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <video>
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     </video>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:19:03 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:19:03 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:19:03 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:19:03 compute-0 nova_compute[186241]: </domain>
Nov 25 06:19:03 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.147 186245 DEBUG nova.compute.manager [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Preparing to wait for external event network-vif-plugged-b0597686-1f09-4b3d-ad11-27d3fbbdde6c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.148 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.148 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.148 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.148 186245 DEBUG nova.virt.libvirt.vif [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1724422885',display_name='tempest-TestNetworkBasicOps-server-1724422885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1724422885',id=1,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLw9bVqco8U0Wp4LUSfq3rP2Y3G4d9+HrSr18rZe33vIbLU5CgQl8aOPgkakXGAmX+BqLFeIZmSSNp4FTfanwb1Zj2Pr3fHYFwGKmsMT3gYm+uRVfIhQGs0huoytbWm/+w==',key_name='tempest-TestNetworkBasicOps-1956672502',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-otdipvg7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:18:56Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=4b7f1c44-c36c-4ce9-b498-4984df4111b3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.149 186245 DEBUG nova.network.os_vif_util [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.149 186245 DEBUG nova.network.os_vif_util [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:c9:8b,bridge_name='br-int',has_traffic_filtering=True,id=b0597686-1f09-4b3d-ad11-27d3fbbdde6c,network=Network(0e4c5e99-aead-49a3-910e-5959edf0d03a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0597686-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.149 186245 DEBUG os_vif [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:c9:8b,bridge_name='br-int',has_traffic_filtering=True,id=b0597686-1f09-4b3d-ad11-27d3fbbdde6c,network=Network(0e4c5e99-aead-49a3-910e-5959edf0d03a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0597686-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.179 186245 DEBUG ovsdbapp.backend.ovs_idl [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.179 186245 DEBUG ovsdbapp.backend.ovs_idl [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.179 186245 DEBUG ovsdbapp.backend.ovs_idl [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.179 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.180 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.180 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.180 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.181 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.183 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.189 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.190 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.190 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.190 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.190 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'f25cb668-6e37-53d4-99a9-35e649aad9a4', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.191 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.194 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.195 186245 INFO oslo.privsep.daemon [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpc4j3ach_/privsep.sock']
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.722 186245 INFO oslo.privsep.daemon [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Spawned new privsep daemon via rootwrap
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.645 211808 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.648 211808 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.650 211808 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.650 211808 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211808
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.946 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.947 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb0597686-1f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.948 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapb0597686-1f, col_values=(('qos', UUID('56e8b9e0-776d-48c7-890b-95c0dca982d3')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.949 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapb0597686-1f, col_values=(('external_ids', {'iface-id': 'b0597686-1f09-4b3d-ad11-27d3fbbdde6c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:c9:8b', 'vm-uuid': '4b7f1c44-c36c-4ce9-b498-4984df4111b3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.950 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:03 compute-0 NetworkManager[55345]: <info>  [1764051543.9506] manager: (tapb0597686-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.952 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.954 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:03 compute-0 nova_compute[186241]: 2025-11-25 06:19:03.956 186245 INFO os_vif [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:c9:8b,bridge_name='br-int',has_traffic_filtering=True,id=b0597686-1f09-4b3d-ad11-27d3fbbdde6c,network=Network(0e4c5e99-aead-49a3-910e-5959edf0d03a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0597686-1f')
Nov 25 06:19:05 compute-0 nova_compute[186241]: 2025-11-25 06:19:05.483 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:19:05 compute-0 nova_compute[186241]: 2025-11-25 06:19:05.484 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:19:05 compute-0 nova_compute[186241]: 2025-11-25 06:19:05.484 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:cd:c9:8b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:19:05 compute-0 nova_compute[186241]: 2025-11-25 06:19:05.485 186245 INFO nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Using config drive
Nov 25 06:19:06 compute-0 nova_compute[186241]: 2025-11-25 06:19:06.865 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.156 186245 INFO nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Creating config drive at /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.config
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.161 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpszz9j8_m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.278 186245 DEBUG oslo_concurrency.processutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpszz9j8_m" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:19:07 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 25 06:19:07 compute-0 kernel: tapb0597686-1f: entered promiscuous mode
Nov 25 06:19:07 compute-0 NetworkManager[55345]: <info>  [1764051547.3233] manager: (tapb0597686-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Nov 25 06:19:07 compute-0 ovn_controller[95135]: 2025-11-25T06:19:07Z|00039|binding|INFO|Claiming lport b0597686-1f09-4b3d-ad11-27d3fbbdde6c for this chassis.
Nov 25 06:19:07 compute-0 ovn_controller[95135]: 2025-11-25T06:19:07Z|00040|binding|INFO|b0597686-1f09-4b3d-ad11-27d3fbbdde6c: Claiming fa:16:3e:cd:c9:8b 10.100.0.12
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.326 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.330 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.336 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:c9:8b 10.100.0.12'], port_security=['fa:16:3e:cd:c9:8b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4b7f1c44-c36c-4ce9-b498-4984df4111b3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e4c5e99-aead-49a3-910e-5959edf0d03a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a247d86c-48ea-4aa8-9306-726933f4704f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6fb2701-3644-4fc1-81d0-634fa89abf62, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=b0597686-1f09-4b3d-ad11-27d3fbbdde6c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.337 103953 INFO neutron.agent.ovn.metadata.agent [-] Port b0597686-1f09-4b3d-ad11-27d3fbbdde6c in datapath 0e4c5e99-aead-49a3-910e-5959edf0d03a bound to our chassis
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.338 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0e4c5e99-aead-49a3-910e-5959edf0d03a
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.354 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[1bfda297-27e0-4d8c-ac6d-907c535fd4da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.355 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0e4c5e99-a1 in ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.358 211354 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0e4c5e99-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.359 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[98b6a111-58f4-466f-b9ed-6e502a41f7f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.360 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ab1a1b2d-5ed4-415a-90e9-d542efbac6be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:07 compute-0 systemd-udevd[211835]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.373 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[16f99928-0714-4de9-a49d-4a62a5f9212b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:07 compute-0 NetworkManager[55345]: <info>  [1764051547.3758] device (tapb0597686-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:19:07 compute-0 NetworkManager[55345]: <info>  [1764051547.3765] device (tapb0597686-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:19:07 compute-0 systemd-machined[152921]: New machine qemu-1-instance-00000001.
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.397 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8f013428-cec4-4573-8941-e68f3f9ffaba]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.398 103953 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpkmzyi8hd/privsep.sock']
Nov 25 06:19:07 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.398 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:07 compute-0 ovn_controller[95135]: 2025-11-25T06:19:07Z|00041|binding|INFO|Setting lport b0597686-1f09-4b3d-ad11-27d3fbbdde6c ovn-installed in OVS
Nov 25 06:19:07 compute-0 ovn_controller[95135]: 2025-11-25T06:19:07Z|00042|binding|INFO|Setting lport b0597686-1f09-4b3d-ad11-27d3fbbdde6c up in Southbound
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.405 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.408 186245 DEBUG nova.network.neutron [req-3c790620-c334-4cdf-a0f2-3e0eb5992472 req-e9734286-9c69-46dc-a1ae-a059179f9ec9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Updated VIF entry in instance network info cache for port b0597686-1f09-4b3d-ad11-27d3fbbdde6c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.408 186245 DEBUG nova.network.neutron [req-3c790620-c334-4cdf-a0f2-3e0eb5992472 req-e9734286-9c69-46dc-a1ae-a059179f9ec9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Updating instance_info_cache with network_info: [{"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.687 186245 DEBUG nova.compute.manager [req-3b288601-8d04-4624-bf9d-f5d03796cdbf req-2836ffc5-f36b-4130-a8be-adb3a5111594 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Received event network-vif-plugged-b0597686-1f09-4b3d-ad11-27d3fbbdde6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.687 186245 DEBUG oslo_concurrency.lockutils [req-3b288601-8d04-4624-bf9d-f5d03796cdbf req-2836ffc5-f36b-4130-a8be-adb3a5111594 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.687 186245 DEBUG oslo_concurrency.lockutils [req-3b288601-8d04-4624-bf9d-f5d03796cdbf req-2836ffc5-f36b-4130-a8be-adb3a5111594 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.687 186245 DEBUG oslo_concurrency.lockutils [req-3b288601-8d04-4624-bf9d-f5d03796cdbf req-2836ffc5-f36b-4130-a8be-adb3a5111594 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.688 186245 DEBUG nova.compute.manager [req-3b288601-8d04-4624-bf9d-f5d03796cdbf req-2836ffc5-f36b-4130-a8be-adb3a5111594 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Processing event network-vif-plugged-b0597686-1f09-4b3d-ad11-27d3fbbdde6c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.719 186245 DEBUG nova.compute.manager [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.726 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.728 186245 INFO nova.virt.libvirt.driver [-] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Instance spawned successfully.
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.729 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:19:07 compute-0 nova_compute[186241]: 2025-11-25 06:19:07.912 186245 DEBUG oslo_concurrency.lockutils [req-3c790620-c334-4cdf-a0f2-3e0eb5992472 req-e9734286-9c69-46dc-a1ae-a059179f9ec9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-4b7f1c44-c36c-4ce9-b498-4984df4111b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.942 103953 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.942 103953 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpkmzyi8hd/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:366
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.865 211867 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.868 211867 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.870 211867 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.870 211867 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211867
Nov 25 06:19:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:07.945 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[058d1e28-29ea-4141-b0e0-1cf551cceb6b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 nova_compute[186241]: 2025-11-25 06:19:08.236 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:19:08 compute-0 nova_compute[186241]: 2025-11-25 06:19:08.237 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:19:08 compute-0 nova_compute[186241]: 2025-11-25 06:19:08.238 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:19:08 compute-0 nova_compute[186241]: 2025-11-25 06:19:08.238 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:19:08 compute-0 nova_compute[186241]: 2025-11-25 06:19:08.238 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:19:08 compute-0 nova_compute[186241]: 2025-11-25 06:19:08.239 186245 DEBUG nova.virt.libvirt.driver [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.368 211867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.368 211867 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.368 211867 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.720 211867 INFO oslo_service.backend [-] Loading backend: eventlet
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.725 211867 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Nov 25 06:19:08 compute-0 nova_compute[186241]: 2025-11-25 06:19:08.744 186245 INFO nova.compute.manager [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Took 11.68 seconds to spawn the instance on the hypervisor.
Nov 25 06:19:08 compute-0 nova_compute[186241]: 2025-11-25 06:19:08.745 186245 DEBUG nova.compute.manager [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.783 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[edcfedad-569d-46a3-a87c-f6d6e2e9420f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 systemd-udevd[211837]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:19:08 compute-0 NetworkManager[55345]: <info>  [1764051548.7962] manager: (tap0e4c5e99-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.795 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[1f418515-c177-4a64-845d-e6f3816c74e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.820 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[b4bb0065-4d29-441f-862a-e25fbf8b812e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.823 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[093a37aa-ad50-41da-87e5-a4e177443b37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 NetworkManager[55345]: <info>  [1764051548.8382] device (tap0e4c5e99-a0): carrier: link connected
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.842 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[3c4df35f-bae5-4465-b05d-b4cafa2ffe47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.855 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[333d953d-e580-4682-9463-60356a36f581]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0e4c5e99-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:78:af:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 253724, 'reachable_time': 36934, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211883, 'error': None, 'target': 'ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.866 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[af6b675c-8d0e-4c82-aa6c-34d05378daee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe78:af8b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 253724, 'tstamp': 253724}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211884, 'error': None, 'target': 'ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.880 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7b5b8656-494f-44a5-8090-122df27817d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0e4c5e99-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:78:af:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 253724, 'reachable_time': 36934, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211885, 'error': None, 'target': 'ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.901 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5df8a504-2926-44f5-9dfe-12f23d28685b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.940 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[d357d5c1-046b-4c79-a1d1-9adcd99d94d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.940 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e4c5e99-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.941 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.941 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e4c5e99-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:08 compute-0 nova_compute[186241]: 2025-11-25 06:19:08.942 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:08 compute-0 NetworkManager[55345]: <info>  [1764051548.9432] manager: (tap0e4c5e99-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Nov 25 06:19:08 compute-0 kernel: tap0e4c5e99-a0: entered promiscuous mode
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.946 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0e4c5e99-a0, col_values=(('external_ids', {'iface-id': '41d1b45c-dacf-4079-b06f-ab644147f8e7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:08 compute-0 nova_compute[186241]: 2025-11-25 06:19:08.947 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:08 compute-0 ovn_controller[95135]: 2025-11-25T06:19:08Z|00043|binding|INFO|Releasing lport 41d1b45c-dacf-4079-b06f-ab644147f8e7 from this chassis (sb_readonly=0)
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.949 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[10d0ab38-d916-4c7b-b8a1-2d595d1c4289]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 nova_compute[186241]: 2025-11-25 06:19:08.949 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.950 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0e4c5e99-aead-49a3-910e-5959edf0d03a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0e4c5e99-aead-49a3-910e-5959edf0d03a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.950 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0e4c5e99-aead-49a3-910e-5959edf0d03a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0e4c5e99-aead-49a3-910e-5959edf0d03a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.950 103953 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 0e4c5e99-aead-49a3-910e-5959edf0d03a disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.950 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0e4c5e99-aead-49a3-910e-5959edf0d03a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0e4c5e99-aead-49a3-910e-5959edf0d03a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.951 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[af3e0007-d49b-4bb9-8ebf-ea8126c1d974]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.951 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0e4c5e99-aead-49a3-910e-5959edf0d03a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0e4c5e99-aead-49a3-910e-5959edf0d03a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.952 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[472e79e5-91c0-4e92-aca5-e8dcd9999e97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.952 103953 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: global
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     log         /dev/log local0 debug
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     log-tag     haproxy-metadata-proxy-0e4c5e99-aead-49a3-910e-5959edf0d03a
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     user        root
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     group       root
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     maxconn     1024
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     pidfile     /var/lib/neutron/external/pids/0e4c5e99-aead-49a3-910e-5959edf0d03a.pid.haproxy
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     daemon
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: defaults
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     log global
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     mode http
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     option httplog
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     option dontlognull
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     option http-server-close
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     option forwardfor
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     retries                 3
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     timeout http-request    30s
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     timeout connect         30s
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     timeout client          32s
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     timeout server          32s
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     timeout http-keep-alive 30s
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: listen listener
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     bind 169.254.169.254:80
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:     http-request add-header X-OVN-Network-ID 0e4c5e99-aead-49a3-910e-5959edf0d03a
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 06:19:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:08.954 103953 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a', 'env', 'PROCESS_TAG=haproxy-0e4c5e99-aead-49a3-910e-5959edf0d03a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0e4c5e99-aead-49a3-910e-5959edf0d03a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Nov 25 06:19:08 compute-0 nova_compute[186241]: 2025-11-25 06:19:08.961 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:09 compute-0 podman[211915]: 2025-11-25 06:19:09.240334209 +0000 UTC m=+0.034144541 container create 7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.vendor=CentOS)
Nov 25 06:19:09 compute-0 nova_compute[186241]: 2025-11-25 06:19:09.263 186245 INFO nova.compute.manager [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Took 16.79 seconds to build instance.
Nov 25 06:19:09 compute-0 systemd[1]: Started libpod-conmon-7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94.scope.
Nov 25 06:19:09 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:19:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec5fce7cd471331274e23735b35264bec29183e6ccaeb4597dc20ca064d894fe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:19:09 compute-0 podman[211925]: 2025-11-25 06:19:09.309740065 +0000 UTC m=+0.047899562 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:19:09 compute-0 podman[211915]: 2025-11-25 06:19:09.315022975 +0000 UTC m=+0.108833307 container init 7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 25 06:19:09 compute-0 podman[211915]: 2025-11-25 06:19:09.319691456 +0000 UTC m=+0.113501788 container start 7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 25 06:19:09 compute-0 podman[211915]: 2025-11-25 06:19:09.226021577 +0000 UTC m=+0.019831929 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:19:09 compute-0 neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a[211938]: [NOTICE]   (211953) : New worker (211955) forked
Nov 25 06:19:09 compute-0 neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a[211938]: [NOTICE]   (211953) : Loading success.
Nov 25 06:19:09 compute-0 nova_compute[186241]: 2025-11-25 06:19:09.767 186245 DEBUG oslo_concurrency.lockutils [None req-a9af0950-d550-429d-a3fb-6c937164d052 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:09 compute-0 nova_compute[186241]: 2025-11-25 06:19:09.863 186245 DEBUG nova.compute.manager [req-5d2548e2-5d0e-451e-90d1-ab92e425e91a req-99f3d274-12a2-4bf7-b29f-be2af30b8624 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Received event network-vif-plugged-b0597686-1f09-4b3d-ad11-27d3fbbdde6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:19:09 compute-0 nova_compute[186241]: 2025-11-25 06:19:09.863 186245 DEBUG oslo_concurrency.lockutils [req-5d2548e2-5d0e-451e-90d1-ab92e425e91a req-99f3d274-12a2-4bf7-b29f-be2af30b8624 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:09 compute-0 nova_compute[186241]: 2025-11-25 06:19:09.864 186245 DEBUG oslo_concurrency.lockutils [req-5d2548e2-5d0e-451e-90d1-ab92e425e91a req-99f3d274-12a2-4bf7-b29f-be2af30b8624 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:09 compute-0 nova_compute[186241]: 2025-11-25 06:19:09.864 186245 DEBUG oslo_concurrency.lockutils [req-5d2548e2-5d0e-451e-90d1-ab92e425e91a req-99f3d274-12a2-4bf7-b29f-be2af30b8624 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:09 compute-0 nova_compute[186241]: 2025-11-25 06:19:09.864 186245 DEBUG nova.compute.manager [req-5d2548e2-5d0e-451e-90d1-ab92e425e91a req-99f3d274-12a2-4bf7-b29f-be2af30b8624 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] No waiting events found dispatching network-vif-plugged-b0597686-1f09-4b3d-ad11-27d3fbbdde6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:19:09 compute-0 nova_compute[186241]: 2025-11-25 06:19:09.864 186245 WARNING nova.compute.manager [req-5d2548e2-5d0e-451e-90d1-ab92e425e91a req-99f3d274-12a2-4bf7-b29f-be2af30b8624 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Received unexpected event network-vif-plugged-b0597686-1f09-4b3d-ad11-27d3fbbdde6c for instance with vm_state active and task_state None.
Nov 25 06:19:11 compute-0 nova_compute[186241]: 2025-11-25 06:19:11.866 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:13 compute-0 nova_compute[186241]: 2025-11-25 06:19:13.950 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:14 compute-0 NetworkManager[55345]: <info>  [1764051554.6334] manager: (patch-br-int-to-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/23)
Nov 25 06:19:14 compute-0 NetworkManager[55345]: <info>  [1764051554.6339] device (patch-br-int-to-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:19:14 compute-0 NetworkManager[55345]: <info>  [1764051554.6347] manager: (patch-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/24)
Nov 25 06:19:14 compute-0 NetworkManager[55345]: <info>  [1764051554.6350] device (patch-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 06:19:14 compute-0 NetworkManager[55345]: <info>  [1764051554.6361] manager: (patch-br-int-to-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Nov 25 06:19:14 compute-0 nova_compute[186241]: 2025-11-25 06:19:14.636 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:14 compute-0 NetworkManager[55345]: <info>  [1764051554.6369] manager: (patch-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Nov 25 06:19:14 compute-0 NetworkManager[55345]: <info>  [1764051554.6372] device (patch-br-int-to-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 25 06:19:14 compute-0 NetworkManager[55345]: <info>  [1764051554.6376] device (patch-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 25 06:19:14 compute-0 ovn_controller[95135]: 2025-11-25T06:19:14Z|00044|binding|INFO|Releasing lport 41d1b45c-dacf-4079-b06f-ab644147f8e7 from this chassis (sb_readonly=0)
Nov 25 06:19:14 compute-0 nova_compute[186241]: 2025-11-25 06:19:14.665 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:14 compute-0 ovn_controller[95135]: 2025-11-25T06:19:14Z|00045|binding|INFO|Releasing lport 41d1b45c-dacf-4079-b06f-ab644147f8e7 from this chassis (sb_readonly=0)
Nov 25 06:19:14 compute-0 nova_compute[186241]: 2025-11-25 06:19:14.669 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:15 compute-0 nova_compute[186241]: 2025-11-25 06:19:15.310 186245 DEBUG nova.compute.manager [req-cf5137f6-aa90-4910-a454-91a7106bb2f6 req-e92d0818-6699-4e41-b595-bcaedde6433f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Received event network-changed-b0597686-1f09-4b3d-ad11-27d3fbbdde6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:19:15 compute-0 nova_compute[186241]: 2025-11-25 06:19:15.311 186245 DEBUG nova.compute.manager [req-cf5137f6-aa90-4910-a454-91a7106bb2f6 req-e92d0818-6699-4e41-b595-bcaedde6433f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Refreshing instance network info cache due to event network-changed-b0597686-1f09-4b3d-ad11-27d3fbbdde6c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:19:15 compute-0 nova_compute[186241]: 2025-11-25 06:19:15.311 186245 DEBUG oslo_concurrency.lockutils [req-cf5137f6-aa90-4910-a454-91a7106bb2f6 req-e92d0818-6699-4e41-b595-bcaedde6433f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-4b7f1c44-c36c-4ce9-b498-4984df4111b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:19:15 compute-0 nova_compute[186241]: 2025-11-25 06:19:15.311 186245 DEBUG oslo_concurrency.lockutils [req-cf5137f6-aa90-4910-a454-91a7106bb2f6 req-e92d0818-6699-4e41-b595-bcaedde6433f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-4b7f1c44-c36c-4ce9-b498-4984df4111b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:19:15 compute-0 nova_compute[186241]: 2025-11-25 06:19:15.311 186245 DEBUG nova.network.neutron [req-cf5137f6-aa90-4910-a454-91a7106bb2f6 req-e92d0818-6699-4e41-b595-bcaedde6433f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Refreshing network info cache for port b0597686-1f09-4b3d-ad11-27d3fbbdde6c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:19:16 compute-0 podman[211961]: 2025-11-25 06:19:16.083037212 +0000 UTC m=+0.060687861 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 06:19:16 compute-0 nova_compute[186241]: 2025-11-25 06:19:16.868 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:18 compute-0 nova_compute[186241]: 2025-11-25 06:19:18.952 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:19 compute-0 ovn_controller[95135]: 2025-11-25T06:19:19Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cd:c9:8b 10.100.0.12
Nov 25 06:19:19 compute-0 ovn_controller[95135]: 2025-11-25T06:19:19Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cd:c9:8b 10.100.0.12
Nov 25 06:19:20 compute-0 podman[211996]: 2025-11-25 06:19:20.065586607 +0000 UTC m=+0.041964665 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:19:20 compute-0 podman[211995]: 2025-11-25 06:19:20.069963188 +0000 UTC m=+0.047643570 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.106 186245 DEBUG nova.network.neutron [req-cf5137f6-aa90-4910-a454-91a7106bb2f6 req-e92d0818-6699-4e41-b595-bcaedde6433f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Updated VIF entry in instance network info cache for port b0597686-1f09-4b3d-ad11-27d3fbbdde6c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.107 186245 DEBUG nova.network.neutron [req-cf5137f6-aa90-4910-a454-91a7106bb2f6 req-e92d0818-6699-4e41-b595-bcaedde6433f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Updating instance_info_cache with network_info: [{"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.211 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.211 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.610 186245 DEBUG oslo_concurrency.lockutils [req-cf5137f6-aa90-4910-a454-91a7106bb2f6 req-e92d0818-6699-4e41-b595-bcaedde6433f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-4b7f1c44-c36c-4ce9-b498-4984df4111b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.716 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.716 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.716 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.716 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.717 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.717 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.717 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:19:20 compute-0 nova_compute[186241]: 2025-11-25 06:19:20.717 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:19:21 compute-0 nova_compute[186241]: 2025-11-25 06:19:21.227 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:21 compute-0 nova_compute[186241]: 2025-11-25 06:19:21.227 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:21 compute-0 nova_compute[186241]: 2025-11-25 06:19:21.227 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:21 compute-0 nova_compute[186241]: 2025-11-25 06:19:21.227 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:19:21 compute-0 nova_compute[186241]: 2025-11-25 06:19:21.870 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:22 compute-0 nova_compute[186241]: 2025-11-25 06:19:22.253 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:19:22 compute-0 nova_compute[186241]: 2025-11-25 06:19:22.300 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:19:22 compute-0 nova_compute[186241]: 2025-11-25 06:19:22.301 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:19:22 compute-0 nova_compute[186241]: 2025-11-25 06:19:22.344 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:19:22 compute-0 nova_compute[186241]: 2025-11-25 06:19:22.530 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:19:22 compute-0 nova_compute[186241]: 2025-11-25 06:19:22.531 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5622MB free_disk=72.98854064941406GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:19:22 compute-0 nova_compute[186241]: 2025-11-25 06:19:22.532 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:22 compute-0 nova_compute[186241]: 2025-11-25 06:19:22.532 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:23 compute-0 nova_compute[186241]: 2025-11-25 06:19:23.563 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance 4b7f1c44-c36c-4ce9-b498-4984df4111b3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:19:23 compute-0 nova_compute[186241]: 2025-11-25 06:19:23.564 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:19:23 compute-0 nova_compute[186241]: 2025-11-25 06:19:23.564 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:19:23 compute-0 nova_compute[186241]: 2025-11-25 06:19:23.592 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating inventory in ProviderTree for provider b9b31722-b833-4ea1-a013-247935742e36 with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 25 06:19:23 compute-0 nova_compute[186241]: 2025-11-25 06:19:23.953 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:24 compute-0 nova_compute[186241]: 2025-11-25 06:19:24.111 186245 ERROR nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] [req-9c41105d-323c-4c65-bf11-1b7ce9819426] Failed to update inventory to [{'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID b9b31722-b833-4ea1-a013-247935742e36.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-9c41105d-323c-4c65-bf11-1b7ce9819426"}]}
Nov 25 06:19:24 compute-0 nova_compute[186241]: 2025-11-25 06:19:24.128 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing inventories for resource provider b9b31722-b833-4ea1-a013-247935742e36 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:822
Nov 25 06:19:24 compute-0 nova_compute[186241]: 2025-11-25 06:19:24.140 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating ProviderTree inventory for provider b9b31722-b833-4ea1-a013-247935742e36 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:786
Nov 25 06:19:24 compute-0 nova_compute[186241]: 2025-11-25 06:19:24.140 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating inventory in ProviderTree for provider b9b31722-b833-4ea1-a013-247935742e36 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 25 06:19:24 compute-0 nova_compute[186241]: 2025-11-25 06:19:24.150 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing aggregate associations for resource provider b9b31722-b833-4ea1-a013-247935742e36, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:831
Nov 25 06:19:24 compute-0 nova_compute[186241]: 2025-11-25 06:19:24.166 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing trait associations for resource provider b9b31722-b833-4ea1-a013-247935742e36, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_AVX512VPCLMULQDQ,HW_CPU_X86_AESNI,HW_CPU_X86_F16C,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,HW_ARCH_X86_64,HW_CPU_X86_AMD_SVM,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX512VAES,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX2,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:843
Nov 25 06:19:24 compute-0 nova_compute[186241]: 2025-11-25 06:19:24.187 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating inventory in ProviderTree for provider b9b31722-b833-4ea1-a013-247935742e36 with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 25 06:19:24 compute-0 nova_compute[186241]: 2025-11-25 06:19:24.723 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updated inventory for provider b9b31722-b833-4ea1-a013-247935742e36 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:975
Nov 25 06:19:24 compute-0 nova_compute[186241]: 2025-11-25 06:19:24.724 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating resource provider b9b31722-b833-4ea1-a013-247935742e36 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 25 06:19:24 compute-0 nova_compute[186241]: 2025-11-25 06:19:24.724 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating inventory in ProviderTree for provider b9b31722-b833-4ea1-a013-247935742e36 with inventory: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 25 06:19:25 compute-0 nova_compute[186241]: 2025-11-25 06:19:25.229 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:19:25 compute-0 nova_compute[186241]: 2025-11-25 06:19:25.229 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:25 compute-0 nova_compute[186241]: 2025-11-25 06:19:25.665 186245 INFO nova.compute.manager [None req-81fd3283-1446-4ceb-b31c-95f2bfca164b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Get console output
Nov 25 06:19:25 compute-0 nova_compute[186241]: 2025-11-25 06:19:25.741 211770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 25 06:19:26 compute-0 podman[212041]: 2025-11-25 06:19:26.057852245 +0000 UTC m=+0.037620344 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 25 06:19:26 compute-0 nova_compute[186241]: 2025-11-25 06:19:26.873 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:28 compute-0 nova_compute[186241]: 2025-11-25 06:19:28.954 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:29 compute-0 podman[212060]: 2025-11-25 06:19:29.055221025 +0000 UTC m=+0.033768904 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 06:19:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:29.108 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:19:29 compute-0 nova_compute[186241]: 2025-11-25 06:19:29.108 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:29.109 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:19:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:31.613 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:ad:ac 10.100.0.18'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-622ce19f-960f-4b6d-93c3-22c8073dbf77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62086cc8-0f51-42bb-a9b0-6996044ab0f9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e1a2d7e2-0581-4567-97e7-51cd94724395) old=Port_Binding(mac=['fa:16:3e:d4:ad:ac'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-622ce19f-960f-4b6d-93c3-22c8073dbf77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:19:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:31.614 103953 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e1a2d7e2-0581-4567-97e7-51cd94724395 in datapath 622ce19f-960f-4b6d-93c3-22c8073dbf77 updated
Nov 25 06:19:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:31.615 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 622ce19f-960f-4b6d-93c3-22c8073dbf77, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:19:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:31.616 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5d170b-818c-4a2c-8a85-66a423d1bb48]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:31 compute-0 nova_compute[186241]: 2025-11-25 06:19:31.876 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:33 compute-0 nova_compute[186241]: 2025-11-25 06:19:33.955 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:34 compute-0 podman[212079]: 2025-11-25 06:19:34.065122507 +0000 UTC m=+0.040942056 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 06:19:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:35.109 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:36 compute-0 nova_compute[186241]: 2025-11-25 06:19:36.876 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:38 compute-0 nova_compute[186241]: 2025-11-25 06:19:38.560 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "44e7315b-3c8f-4079-8553-02a6fd6f107d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:38 compute-0 nova_compute[186241]: 2025-11-25 06:19:38.560 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:38 compute-0 nova_compute[186241]: 2025-11-25 06:19:38.956 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:39 compute-0 nova_compute[186241]: 2025-11-25 06:19:39.063 186245 DEBUG nova.compute.manager [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:19:39 compute-0 nova_compute[186241]: 2025-11-25 06:19:39.591 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:39 compute-0 nova_compute[186241]: 2025-11-25 06:19:39.591 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:39 compute-0 nova_compute[186241]: 2025-11-25 06:19:39.596 186245 DEBUG nova.virt.hardware [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:19:39 compute-0 nova_compute[186241]: 2025-11-25 06:19:39.597 186245 INFO nova.compute.claims [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:19:40 compute-0 podman[212096]: 2025-11-25 06:19:40.059115259 +0000 UTC m=+0.034280513 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 25 06:19:40 compute-0 nova_compute[186241]: 2025-11-25 06:19:40.642 186245 DEBUG nova.compute.provider_tree [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:19:41 compute-0 nova_compute[186241]: 2025-11-25 06:19:41.145 186245 DEBUG nova.scheduler.client.report [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:19:41 compute-0 nova_compute[186241]: 2025-11-25 06:19:41.650 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:41 compute-0 nova_compute[186241]: 2025-11-25 06:19:41.650 186245 DEBUG nova.compute.manager [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:19:41 compute-0 nova_compute[186241]: 2025-11-25 06:19:41.878 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:42 compute-0 nova_compute[186241]: 2025-11-25 06:19:42.156 186245 DEBUG nova.compute.manager [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:19:42 compute-0 nova_compute[186241]: 2025-11-25 06:19:42.157 186245 DEBUG nova.network.neutron [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:19:42 compute-0 nova_compute[186241]: 2025-11-25 06:19:42.442 186245 DEBUG nova.policy [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:19:42 compute-0 nova_compute[186241]: 2025-11-25 06:19:42.661 186245 INFO nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:19:43 compute-0 nova_compute[186241]: 2025-11-25 06:19:43.164 186245 DEBUG nova.compute.manager [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:19:43 compute-0 nova_compute[186241]: 2025-11-25 06:19:43.613 186245 DEBUG nova.network.neutron [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Successfully created port: 8679bd61-0016-49b1-a137-221904386339 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:19:43 compute-0 nova_compute[186241]: 2025-11-25 06:19:43.958 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.174 186245 DEBUG nova.compute.manager [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.175 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.175 186245 INFO nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Creating image(s)
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.176 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.176 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.177 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.177 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.180 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.181 186245 DEBUG oslo_concurrency.processutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.224 186245 DEBUG oslo_concurrency.processutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.225 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.226 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.226 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.229 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.230 186245 DEBUG oslo_concurrency.processutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.271 186245 DEBUG oslo_concurrency.processutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.272 186245 DEBUG oslo_concurrency.processutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.290 186245 DEBUG oslo_concurrency.processutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.291 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.291 186245 DEBUG oslo_concurrency.processutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.334 186245 DEBUG oslo_concurrency.processutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.335 186245 DEBUG nova.virt.disk.api [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.335 186245 DEBUG oslo_concurrency.processutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.379 186245 DEBUG oslo_concurrency.processutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.380 186245 DEBUG nova.virt.disk.api [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.380 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.380 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Ensure instance console log exists: /var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.381 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.381 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.381 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.412 186245 DEBUG nova.network.neutron [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Successfully updated port: 8679bd61-0016-49b1-a137-221904386339 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.546 186245 DEBUG nova.compute.manager [req-38d91aa2-dc37-4f16-98d6-aa66b7e1fc52 req-7bf826af-99bc-4a76-95ca-6ee2b58c4476 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Received event network-changed-8679bd61-0016-49b1-a137-221904386339 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.546 186245 DEBUG nova.compute.manager [req-38d91aa2-dc37-4f16-98d6-aa66b7e1fc52 req-7bf826af-99bc-4a76-95ca-6ee2b58c4476 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Refreshing instance network info cache due to event network-changed-8679bd61-0016-49b1-a137-221904386339. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.548 186245 DEBUG oslo_concurrency.lockutils [req-38d91aa2-dc37-4f16-98d6-aa66b7e1fc52 req-7bf826af-99bc-4a76-95ca-6ee2b58c4476 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-44e7315b-3c8f-4079-8553-02a6fd6f107d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.548 186245 DEBUG oslo_concurrency.lockutils [req-38d91aa2-dc37-4f16-98d6-aa66b7e1fc52 req-7bf826af-99bc-4a76-95ca-6ee2b58c4476 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-44e7315b-3c8f-4079-8553-02a6fd6f107d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.548 186245 DEBUG nova.network.neutron [req-38d91aa2-dc37-4f16-98d6-aa66b7e1fc52 req-7bf826af-99bc-4a76-95ca-6ee2b58c4476 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Refreshing network info cache for port 8679bd61-0016-49b1-a137-221904386339 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:19:44 compute-0 nova_compute[186241]: 2025-11-25 06:19:44.915 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-44e7315b-3c8f-4079-8553-02a6fd6f107d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:19:46 compute-0 nova_compute[186241]: 2025-11-25 06:19:46.107 186245 DEBUG nova.network.neutron [req-38d91aa2-dc37-4f16-98d6-aa66b7e1fc52 req-7bf826af-99bc-4a76-95ca-6ee2b58c4476 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:19:46 compute-0 nova_compute[186241]: 2025-11-25 06:19:46.879 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:47 compute-0 podman[212132]: 2025-11-25 06:19:47.079040381 +0000 UTC m=+0.057816470 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:19:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:47.341 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:47.341 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:47.341 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:47 compute-0 nova_compute[186241]: 2025-11-25 06:19:47.350 186245 DEBUG nova.network.neutron [req-38d91aa2-dc37-4f16-98d6-aa66b7e1fc52 req-7bf826af-99bc-4a76-95ca-6ee2b58c4476 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:19:47 compute-0 nova_compute[186241]: 2025-11-25 06:19:47.854 186245 DEBUG oslo_concurrency.lockutils [req-38d91aa2-dc37-4f16-98d6-aa66b7e1fc52 req-7bf826af-99bc-4a76-95ca-6ee2b58c4476 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-44e7315b-3c8f-4079-8553-02a6fd6f107d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:19:47 compute-0 nova_compute[186241]: 2025-11-25 06:19:47.854 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-44e7315b-3c8f-4079-8553-02a6fd6f107d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:19:47 compute-0 nova_compute[186241]: 2025-11-25 06:19:47.855 186245 DEBUG nova.network.neutron [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:19:48 compute-0 nova_compute[186241]: 2025-11-25 06:19:48.960 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:49 compute-0 nova_compute[186241]: 2025-11-25 06:19:49.119 186245 DEBUG nova.network.neutron [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:19:51 compute-0 podman[212158]: 2025-11-25 06:19:51.064012591 +0000 UTC m=+0.039218871 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:19:51 compute-0 podman[212157]: 2025-11-25 06:19:51.066249909 +0000 UTC m=+0.044431628 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.115 186245 DEBUG nova.network.neutron [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Updating instance_info_cache with network_info: [{"id": "8679bd61-0016-49b1-a137-221904386339", "address": "fa:16:3e:a6:18:bc", "network": {"id": "622ce19f-960f-4b6d-93c3-22c8073dbf77", "bridge": "br-int", "label": "tempest-network-smoke--54157590", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8679bd61-00", "ovs_interfaceid": "8679bd61-0016-49b1-a137-221904386339", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.618 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-44e7315b-3c8f-4079-8553-02a6fd6f107d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.618 186245 DEBUG nova.compute.manager [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Instance network_info: |[{"id": "8679bd61-0016-49b1-a137-221904386339", "address": "fa:16:3e:a6:18:bc", "network": {"id": "622ce19f-960f-4b6d-93c3-22c8073dbf77", "bridge": "br-int", "label": "tempest-network-smoke--54157590", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8679bd61-00", "ovs_interfaceid": "8679bd61-0016-49b1-a137-221904386339", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.620 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Start _get_guest_xml network_info=[{"id": "8679bd61-0016-49b1-a137-221904386339", "address": "fa:16:3e:a6:18:bc", "network": {"id": "622ce19f-960f-4b6d-93c3-22c8073dbf77", "bridge": "br-int", "label": "tempest-network-smoke--54157590", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8679bd61-00", "ovs_interfaceid": "8679bd61-0016-49b1-a137-221904386339", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.623 186245 WARNING nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.624 186245 DEBUG nova.virt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-953490707', uuid='44e7315b-3c8f-4079-8553-02a6fd6f107d'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "8679bd61-0016-49b1-a137-221904386339", "address": "fa:16:3e:a6:18:bc", "network": {"id": "622ce19f-960f-4b6d-93c3-22c8073dbf77", "bridge": "br-int", "label": "tempest-network-smoke--54157590", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8679bd61-00", "ovs_interfaceid": "8679bd61-0016-49b1-a137-221904386339", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764051591.6245403) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.629 186245 DEBUG nova.virt.libvirt.host [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.629 186245 DEBUG nova.virt.libvirt.host [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.631 186245 DEBUG nova.virt.libvirt.host [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.631 186245 DEBUG nova.virt.libvirt.host [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.632 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.632 186245 DEBUG nova.virt.hardware [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.632 186245 DEBUG nova.virt.hardware [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.632 186245 DEBUG nova.virt.hardware [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.633 186245 DEBUG nova.virt.hardware [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.633 186245 DEBUG nova.virt.hardware [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.633 186245 DEBUG nova.virt.hardware [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.633 186245 DEBUG nova.virt.hardware [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.633 186245 DEBUG nova.virt.hardware [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.634 186245 DEBUG nova.virt.hardware [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.634 186245 DEBUG nova.virt.hardware [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.634 186245 DEBUG nova.virt.hardware [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.636 186245 DEBUG nova.virt.libvirt.vif [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:19:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-953490707',display_name='tempest-TestNetworkBasicOps-server-953490707',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-953490707',id=2,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+Cg1gQgYMJkyvh+3AQUwPVx+19d/oRZHSGYziiVMz+/V6NCZoJWvcWk5srLg/9x/dqcPR0nplDGGiH3SNDL8e2kvlFYD480J7OURsqH5WOEs4u+Wwm+3s34NJcZa9xkQ==',key_name='tempest-TestNetworkBasicOps-348265766',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-yl9ehua5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:19:43Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=44e7315b-3c8f-4079-8553-02a6fd6f107d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8679bd61-0016-49b1-a137-221904386339", "address": "fa:16:3e:a6:18:bc", "network": {"id": "622ce19f-960f-4b6d-93c3-22c8073dbf77", "bridge": "br-int", "label": "tempest-network-smoke--54157590", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8679bd61-00", "ovs_interfaceid": "8679bd61-0016-49b1-a137-221904386339", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.637 186245 DEBUG nova.network.os_vif_util [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "8679bd61-0016-49b1-a137-221904386339", "address": "fa:16:3e:a6:18:bc", "network": {"id": "622ce19f-960f-4b6d-93c3-22c8073dbf77", "bridge": "br-int", "label": "tempest-network-smoke--54157590", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8679bd61-00", "ovs_interfaceid": "8679bd61-0016-49b1-a137-221904386339", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.637 186245 DEBUG nova.network.os_vif_util [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:18:bc,bridge_name='br-int',has_traffic_filtering=True,id=8679bd61-0016-49b1-a137-221904386339,network=Network(622ce19f-960f-4b6d-93c3-22c8073dbf77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8679bd61-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.638 186245 DEBUG nova.objects.instance [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid 44e7315b-3c8f-4079-8553-02a6fd6f107d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:19:51 compute-0 nova_compute[186241]: 2025-11-25 06:19:51.880 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.141 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:19:52 compute-0 nova_compute[186241]:   <uuid>44e7315b-3c8f-4079-8553-02a6fd6f107d</uuid>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   <name>instance-00000002</name>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-953490707</nova:name>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:19:51</nova:creationTime>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:19:52 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:19:52 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:19:52 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:19:52 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:19:52 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:19:52 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:19:52 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:19:52 compute-0 nova_compute[186241]:         <nova:port uuid="8679bd61-0016-49b1-a137-221904386339">
Nov 25 06:19:52 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.30" ipVersion="4"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <system>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <entry name="serial">44e7315b-3c8f-4079-8553-02a6fd6f107d</entry>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <entry name="uuid">44e7315b-3c8f-4079-8553-02a6fd6f107d</entry>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     </system>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   <os>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   </os>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   <features>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   </features>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.config"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:a6:18:bc"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <target dev="tap8679bd61-00"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/console.log" append="off"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <video>
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     </video>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:19:52 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:19:52 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:19:52 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:19:52 compute-0 nova_compute[186241]: </domain>
Nov 25 06:19:52 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.142 186245 DEBUG nova.compute.manager [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Preparing to wait for external event network-vif-plugged-8679bd61-0016-49b1-a137-221904386339 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.143 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.143 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.143 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.144 186245 DEBUG nova.virt.libvirt.vif [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:19:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-953490707',display_name='tempest-TestNetworkBasicOps-server-953490707',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-953490707',id=2,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+Cg1gQgYMJkyvh+3AQUwPVx+19d/oRZHSGYziiVMz+/V6NCZoJWvcWk5srLg/9x/dqcPR0nplDGGiH3SNDL8e2kvlFYD480J7OURsqH5WOEs4u+Wwm+3s34NJcZa9xkQ==',key_name='tempest-TestNetworkBasicOps-348265766',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-yl9ehua5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:19:43Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=44e7315b-3c8f-4079-8553-02a6fd6f107d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8679bd61-0016-49b1-a137-221904386339", "address": "fa:16:3e:a6:18:bc", "network": {"id": "622ce19f-960f-4b6d-93c3-22c8073dbf77", "bridge": "br-int", "label": "tempest-network-smoke--54157590", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8679bd61-00", "ovs_interfaceid": "8679bd61-0016-49b1-a137-221904386339", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.144 186245 DEBUG nova.network.os_vif_util [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "8679bd61-0016-49b1-a137-221904386339", "address": "fa:16:3e:a6:18:bc", "network": {"id": "622ce19f-960f-4b6d-93c3-22c8073dbf77", "bridge": "br-int", "label": "tempest-network-smoke--54157590", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8679bd61-00", "ovs_interfaceid": "8679bd61-0016-49b1-a137-221904386339", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.144 186245 DEBUG nova.network.os_vif_util [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:18:bc,bridge_name='br-int',has_traffic_filtering=True,id=8679bd61-0016-49b1-a137-221904386339,network=Network(622ce19f-960f-4b6d-93c3-22c8073dbf77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8679bd61-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.145 186245 DEBUG os_vif [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:18:bc,bridge_name='br-int',has_traffic_filtering=True,id=8679bd61-0016-49b1-a137-221904386339,network=Network(622ce19f-960f-4b6d-93c3-22c8073dbf77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8679bd61-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.145 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.145 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.146 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.146 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.146 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3a750ea0-3578-51d8-94be-bb3dbd6b5ab4', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.148 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.150 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.150 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8679bd61-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.151 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap8679bd61-00, col_values=(('qos', UUID('b4096eb3-fff6-4542-8db4-14e004811610')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.151 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap8679bd61-00, col_values=(('external_ids', {'iface-id': '8679bd61-0016-49b1-a137-221904386339', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a6:18:bc', 'vm-uuid': '44e7315b-3c8f-4079-8553-02a6fd6f107d'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:52 compute-0 NetworkManager[55345]: <info>  [1764051592.1526] manager: (tap8679bd61-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.153 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.156 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:52 compute-0 nova_compute[186241]: 2025-11-25 06:19:52.156 186245 INFO os_vif [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:18:bc,bridge_name='br-int',has_traffic_filtering=True,id=8679bd61-0016-49b1-a137-221904386339,network=Network(622ce19f-960f-4b6d-93c3-22c8073dbf77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8679bd61-00')
Nov 25 06:19:53 compute-0 nova_compute[186241]: 2025-11-25 06:19:53.677 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:19:53 compute-0 nova_compute[186241]: 2025-11-25 06:19:53.678 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:19:53 compute-0 nova_compute[186241]: 2025-11-25 06:19:53.678 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:a6:18:bc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:19:53 compute-0 nova_compute[186241]: 2025-11-25 06:19:53.679 186245 INFO nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Using config drive
Nov 25 06:19:55 compute-0 nova_compute[186241]: 2025-11-25 06:19:55.409 186245 INFO nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Creating config drive at /var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.config
Nov 25 06:19:55 compute-0 nova_compute[186241]: 2025-11-25 06:19:55.413 186245 DEBUG oslo_concurrency.processutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmph9gq1w9i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:19:55 compute-0 nova_compute[186241]: 2025-11-25 06:19:55.530 186245 DEBUG oslo_concurrency.processutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmph9gq1w9i" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:19:55 compute-0 kernel: tap8679bd61-00: entered promiscuous mode
Nov 25 06:19:55 compute-0 NetworkManager[55345]: <info>  [1764051595.5633] manager: (tap8679bd61-00): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Nov 25 06:19:55 compute-0 ovn_controller[95135]: 2025-11-25T06:19:55Z|00046|binding|INFO|Claiming lport 8679bd61-0016-49b1-a137-221904386339 for this chassis.
Nov 25 06:19:55 compute-0 ovn_controller[95135]: 2025-11-25T06:19:55Z|00047|binding|INFO|8679bd61-0016-49b1-a137-221904386339: Claiming fa:16:3e:a6:18:bc 10.100.0.30
Nov 25 06:19:55 compute-0 nova_compute[186241]: 2025-11-25 06:19:55.568 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.571 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:18:bc 10.100.0.30'], port_security=['fa:16:3e:a6:18:bc 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': '44e7315b-3c8f-4079-8553-02a6fd6f107d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-622ce19f-960f-4b6d-93c3-22c8073dbf77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2944a666-502c-4b4a-991f-0b2eb4d34ba8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62086cc8-0f51-42bb-a9b0-6996044ab0f9, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=8679bd61-0016-49b1-a137-221904386339) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.572 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 8679bd61-0016-49b1-a137-221904386339 in datapath 622ce19f-960f-4b6d-93c3-22c8073dbf77 bound to our chassis
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.574 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 622ce19f-960f-4b6d-93c3-22c8073dbf77
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.581 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[db98883b-12bd-4d99-8a52-151dff015010]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.582 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap622ce19f-91 in ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.583 211354 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap622ce19f-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.584 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[489d3e45-4136-465c-81bd-fa465a89bdce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.584 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7feb0736-0a47-46be-a95c-3fa19b59c79a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 systemd-udevd[212217]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.604 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[f174aaa0-2620-47aa-8b43-3ea7b8d6ece4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_controller[95135]: 2025-11-25T06:19:55Z|00048|binding|INFO|Setting lport 8679bd61-0016-49b1-a137-221904386339 ovn-installed in OVS
Nov 25 06:19:55 compute-0 ovn_controller[95135]: 2025-11-25T06:19:55Z|00049|binding|INFO|Setting lport 8679bd61-0016-49b1-a137-221904386339 up in Southbound
Nov 25 06:19:55 compute-0 nova_compute[186241]: 2025-11-25 06:19:55.607 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:55 compute-0 NetworkManager[55345]: <info>  [1764051595.6109] device (tap8679bd61-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:19:55 compute-0 NetworkManager[55345]: <info>  [1764051595.6118] device (tap8679bd61-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:19:55 compute-0 systemd-machined[152921]: New machine qemu-2-instance-00000002.
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.617 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[4ca2cc4e-50b1-439e-b0b8-e9148d2bc1d2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.638 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[46dc803f-f3cb-4a3b-8efc-1eb8c8f8b56c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 NetworkManager[55345]: <info>  [1764051595.6426] manager: (tap622ce19f-90): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.642 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[b2dfcf37-ab5d-4585-8b04-9099700a1365]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.667 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d77fde-dcca-457e-8760-8378a9d3a00d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.671 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[20d9b0de-4c8f-4410-a79a-e0f19bf8c2bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 NetworkManager[55345]: <info>  [1764051595.6874] device (tap622ce19f-90): carrier: link connected
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.691 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[2c053996-b7b1-4316-929f-7941b97b28a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.704 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[802fbf11-6717-4f06-8b6a-6b486e7b752e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap622ce19f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:ad:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 258409, 'reachable_time': 16597, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212241, 'error': None, 'target': 'ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.716 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[e6d0e942-0dc1-441d-9d27-f26906181ba1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:adac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 258409, 'tstamp': 258409}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212242, 'error': None, 'target': 'ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.728 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[3dc92e79-6a5b-4bc6-b6d7-1787d1b1354a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap622ce19f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:ad:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 258409, 'reachable_time': 16597, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212243, 'error': None, 'target': 'ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.751 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[3c02ef19-055a-4b8c-894e-f9232dd8bec4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.792 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[17285ee6-aa91-47cb-89ab-30c57d9a0a85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.793 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap622ce19f-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.794 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.794 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap622ce19f-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:55 compute-0 kernel: tap622ce19f-90: entered promiscuous mode
Nov 25 06:19:55 compute-0 nova_compute[186241]: 2025-11-25 06:19:55.795 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:55 compute-0 NetworkManager[55345]: <info>  [1764051595.7961] manager: (tap622ce19f-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.803 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap622ce19f-90, col_values=(('external_ids', {'iface-id': 'e1a2d7e2-0581-4567-97e7-51cd94724395'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:19:55 compute-0 nova_compute[186241]: 2025-11-25 06:19:55.804 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:55 compute-0 ovn_controller[95135]: 2025-11-25T06:19:55Z|00050|binding|INFO|Releasing lport e1a2d7e2-0581-4567-97e7-51cd94724395 from this chassis (sb_readonly=0)
Nov 25 06:19:55 compute-0 nova_compute[186241]: 2025-11-25 06:19:55.804 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:55 compute-0 nova_compute[186241]: 2025-11-25 06:19:55.817 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.818 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[96e299ce-c120-4394-99c9-f800bd57f666]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.819 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/622ce19f-960f-4b6d-93c3-22c8073dbf77.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/622ce19f-960f-4b6d-93c3-22c8073dbf77.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.819 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/622ce19f-960f-4b6d-93c3-22c8073dbf77.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/622ce19f-960f-4b6d-93c3-22c8073dbf77.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.819 103953 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 622ce19f-960f-4b6d-93c3-22c8073dbf77 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.819 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/622ce19f-960f-4b6d-93c3-22c8073dbf77.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/622ce19f-960f-4b6d-93c3-22c8073dbf77.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.819 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c3adaf-b37e-409c-9d87-37afab37a4e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.820 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/622ce19f-960f-4b6d-93c3-22c8073dbf77.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/622ce19f-960f-4b6d-93c3-22c8073dbf77.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.820 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[4884613d-f197-4da5-9be3-3cd390ee2092]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.820 103953 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: global
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     log         /dev/log local0 debug
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     log-tag     haproxy-metadata-proxy-622ce19f-960f-4b6d-93c3-22c8073dbf77
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     user        root
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     group       root
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     maxconn     1024
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     pidfile     /var/lib/neutron/external/pids/622ce19f-960f-4b6d-93c3-22c8073dbf77.pid.haproxy
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     daemon
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: defaults
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     log global
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     mode http
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     option httplog
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     option dontlognull
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     option http-server-close
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     option forwardfor
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     retries                 3
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     timeout http-request    30s
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     timeout connect         30s
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     timeout client          32s
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     timeout server          32s
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     timeout http-keep-alive 30s
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: listen listener
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     bind 169.254.169.254:80
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:     http-request add-header X-OVN-Network-ID 622ce19f-960f-4b6d-93c3-22c8073dbf77
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 06:19:55 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:19:55.821 103953 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77', 'env', 'PROCESS_TAG=haproxy-622ce19f-960f-4b6d-93c3-22c8073dbf77', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/622ce19f-960f-4b6d-93c3-22c8073dbf77.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Nov 25 06:19:56 compute-0 podman[212279]: 2025-11-25 06:19:56.104968967 +0000 UTC m=+0.030112536 container create e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 25 06:19:56 compute-0 systemd[1]: Started libpod-conmon-e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d.scope.
Nov 25 06:19:56 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:19:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afda3b8ae862b9a6c0f05633a9a00ef0d28360810a58f1f83cf4225e71162aeb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:19:56 compute-0 podman[212279]: 2025-11-25 06:19:56.151760694 +0000 UTC m=+0.076904282 container init e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 25 06:19:56 compute-0 podman[212279]: 2025-11-25 06:19:56.15631097 +0000 UTC m=+0.081454540 container start e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:19:56 compute-0 podman[212279]: 2025-11-25 06:19:56.091592661 +0000 UTC m=+0.016736251 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:19:56 compute-0 neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77[212292]: [NOTICE]   (212309) : New worker (212313) forked
Nov 25 06:19:56 compute-0 neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77[212292]: [NOTICE]   (212309) : Loading success.
Nov 25 06:19:56 compute-0 podman[212288]: 2025-11-25 06:19:56.189260544 +0000 UTC m=+0.060717889 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:19:56 compute-0 nova_compute[186241]: 2025-11-25 06:19:56.570 186245 DEBUG nova.compute.manager [req-bd78de3c-de77-488f-9cb2-571cd265eb24 req-1a62b9a8-4d05-4e55-a224-623277220c8c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Received event network-vif-plugged-8679bd61-0016-49b1-a137-221904386339 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:19:56 compute-0 nova_compute[186241]: 2025-11-25 06:19:56.570 186245 DEBUG oslo_concurrency.lockutils [req-bd78de3c-de77-488f-9cb2-571cd265eb24 req-1a62b9a8-4d05-4e55-a224-623277220c8c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:56 compute-0 nova_compute[186241]: 2025-11-25 06:19:56.570 186245 DEBUG oslo_concurrency.lockutils [req-bd78de3c-de77-488f-9cb2-571cd265eb24 req-1a62b9a8-4d05-4e55-a224-623277220c8c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:56 compute-0 nova_compute[186241]: 2025-11-25 06:19:56.571 186245 DEBUG oslo_concurrency.lockutils [req-bd78de3c-de77-488f-9cb2-571cd265eb24 req-1a62b9a8-4d05-4e55-a224-623277220c8c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:56 compute-0 nova_compute[186241]: 2025-11-25 06:19:56.571 186245 DEBUG nova.compute.manager [req-bd78de3c-de77-488f-9cb2-571cd265eb24 req-1a62b9a8-4d05-4e55-a224-623277220c8c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Processing event network-vif-plugged-8679bd61-0016-49b1-a137-221904386339 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:19:56 compute-0 nova_compute[186241]: 2025-11-25 06:19:56.571 186245 DEBUG nova.compute.manager [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:19:56 compute-0 nova_compute[186241]: 2025-11-25 06:19:56.576 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:19:56 compute-0 nova_compute[186241]: 2025-11-25 06:19:56.578 186245 INFO nova.virt.libvirt.driver [-] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Instance spawned successfully.
Nov 25 06:19:56 compute-0 nova_compute[186241]: 2025-11-25 06:19:56.578 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:19:56 compute-0 nova_compute[186241]: 2025-11-25 06:19:56.882 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:57 compute-0 nova_compute[186241]: 2025-11-25 06:19:57.086 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:19:57 compute-0 nova_compute[186241]: 2025-11-25 06:19:57.087 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:19:57 compute-0 nova_compute[186241]: 2025-11-25 06:19:57.087 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:19:57 compute-0 nova_compute[186241]: 2025-11-25 06:19:57.087 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:19:57 compute-0 nova_compute[186241]: 2025-11-25 06:19:57.088 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:19:57 compute-0 nova_compute[186241]: 2025-11-25 06:19:57.088 186245 DEBUG nova.virt.libvirt.driver [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:19:57 compute-0 nova_compute[186241]: 2025-11-25 06:19:57.152 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:19:57 compute-0 nova_compute[186241]: 2025-11-25 06:19:57.593 186245 INFO nova.compute.manager [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Took 13.42 seconds to spawn the instance on the hypervisor.
Nov 25 06:19:57 compute-0 nova_compute[186241]: 2025-11-25 06:19:57.593 186245 DEBUG nova.compute.manager [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:19:58 compute-0 nova_compute[186241]: 2025-11-25 06:19:58.105 186245 INFO nova.compute.manager [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Took 18.54 seconds to build instance.
Nov 25 06:19:58 compute-0 nova_compute[186241]: 2025-11-25 06:19:58.607 186245 DEBUG oslo_concurrency.lockutils [None req-c9809e01-4893-4889-a02d-3c4a42ad785c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.047s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:58 compute-0 nova_compute[186241]: 2025-11-25 06:19:58.719 186245 DEBUG nova.compute.manager [req-609a5ff8-e4f0-42d4-a0ee-a799934528c2 req-fa1b3acf-bacb-4dbf-b261-7deda113e8f1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Received event network-vif-plugged-8679bd61-0016-49b1-a137-221904386339 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:19:58 compute-0 nova_compute[186241]: 2025-11-25 06:19:58.719 186245 DEBUG oslo_concurrency.lockutils [req-609a5ff8-e4f0-42d4-a0ee-a799934528c2 req-fa1b3acf-bacb-4dbf-b261-7deda113e8f1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:19:58 compute-0 nova_compute[186241]: 2025-11-25 06:19:58.719 186245 DEBUG oslo_concurrency.lockutils [req-609a5ff8-e4f0-42d4-a0ee-a799934528c2 req-fa1b3acf-bacb-4dbf-b261-7deda113e8f1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:19:58 compute-0 nova_compute[186241]: 2025-11-25 06:19:58.720 186245 DEBUG oslo_concurrency.lockutils [req-609a5ff8-e4f0-42d4-a0ee-a799934528c2 req-fa1b3acf-bacb-4dbf-b261-7deda113e8f1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:19:58 compute-0 nova_compute[186241]: 2025-11-25 06:19:58.720 186245 DEBUG nova.compute.manager [req-609a5ff8-e4f0-42d4-a0ee-a799934528c2 req-fa1b3acf-bacb-4dbf-b261-7deda113e8f1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] No waiting events found dispatching network-vif-plugged-8679bd61-0016-49b1-a137-221904386339 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:19:58 compute-0 nova_compute[186241]: 2025-11-25 06:19:58.720 186245 WARNING nova.compute.manager [req-609a5ff8-e4f0-42d4-a0ee-a799934528c2 req-fa1b3acf-bacb-4dbf-b261-7deda113e8f1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Received unexpected event network-vif-plugged-8679bd61-0016-49b1-a137-221904386339 for instance with vm_state active and task_state None.
Nov 25 06:19:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:19:59.549 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:19:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:19:59.929 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/4b7f1c44-c36c-4ce9-b498-4984df4111b3 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}e471cc3fc7ae9ac5d8fd794e8aefa20e5f5c77c3e3edccb41964d2d46a7818d3" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Nov 25 06:20:00 compute-0 podman[212318]: 2025-11-25 06:20:00.066692062 +0000 UTC m=+0.042961126 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64)
Nov 25 06:20:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:01.414 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1975 Content-Type: application/json Date: Tue, 25 Nov 2025 06:19:59 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-8c24dfc5-fb03-48b7-b9b6-51e3d65b5e78 x-openstack-request-id: req-8c24dfc5-fb03-48b7-b9b6-51e3d65b5e78 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Nov 25 06:20:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:01.415 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "4b7f1c44-c36c-4ce9-b498-4984df4111b3", "name": "tempest-TestNetworkBasicOps-server-1724422885", "status": "ACTIVE", "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "user_id": "66a05d0ca82146a5a458244c8e5364de", "metadata": {}, "hostId": "d6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5", "image": {"id": "5215c26e-be2f-40b4-ac47-476bfa3cf3f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/5215c26e-be2f-40b4-ac47-476bfa3cf3f2"}]}, "flavor": {"id": "53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac"}]}, "created": "2025-11-25T06:18:48Z", "updated": "2025-11-25T06:19:08Z", "addresses": {"tempest-network-smoke--433373947": [{"version": 4, "addr": "10.100.0.12", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:cd:c9:8b"}, {"version": 4, "addr": "192.168.122.230", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:cd:c9:8b"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/4b7f1c44-c36c-4ce9-b498-4984df4111b3"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/4b7f1c44-c36c-4ce9-b498-4984df4111b3"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-1956672502", "OS-SRV-USG:launched_at": "2025-11-25T06:19:08.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-secgroup-smoke-614840828"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000001", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Nov 25 06:20:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:01.415 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/4b7f1c44-c36c-4ce9-b498-4984df4111b3 used request id req-8c24dfc5-fb03-48b7-b9b6-51e3d65b5e78 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Nov 25 06:20:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:01.416 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4b7f1c44-c36c-4ce9-b498-4984df4111b3', 'name': 'tempest-TestNetworkBasicOps-server-1724422885', 'flavor': {'id': '53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd90b557db9104ecfb816b1cdab8712bd', 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'hostId': 'd6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Nov 25 06:20:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:01.418 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/44e7315b-3c8f-4079-8553-02a6fd6f107d -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}e471cc3fc7ae9ac5d8fd794e8aefa20e5f5c77c3e3edccb41964d2d46a7818d3" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Nov 25 06:20:01 compute-0 nova_compute[186241]: 2025-11-25 06:20:01.886 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:02 compute-0 nova_compute[186241]: 2025-11-25 06:20:02.153 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.445 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1851 Content-Type: application/json Date: Tue, 25 Nov 2025 06:20:01 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-4d5e6821-7abc-4cf4-a745-5890619125bf x-openstack-request-id: req-4d5e6821-7abc-4cf4-a745-5890619125bf _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.445 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "44e7315b-3c8f-4079-8553-02a6fd6f107d", "name": "tempest-TestNetworkBasicOps-server-953490707", "status": "ACTIVE", "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "user_id": "66a05d0ca82146a5a458244c8e5364de", "metadata": {}, "hostId": "d6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5", "image": {"id": "5215c26e-be2f-40b4-ac47-476bfa3cf3f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/5215c26e-be2f-40b4-ac47-476bfa3cf3f2"}]}, "flavor": {"id": "53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac"}]}, "created": "2025-11-25T06:19:37Z", "updated": "2025-11-25T06:19:57Z", "addresses": {"tempest-network-smoke--54157590": [{"version": 4, "addr": "10.100.0.30", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:a6:18:bc"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/44e7315b-3c8f-4079-8553-02a6fd6f107d"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/44e7315b-3c8f-4079-8553-02a6fd6f107d"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-348265766", "OS-SRV-USG:launched_at": "2025-11-25T06:19:57.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-secgroup-smoke-1776066085"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000002", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.445 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/44e7315b-3c8f-4079-8553-02a6fd6f107d used request id req-4d5e6821-7abc-4cf4-a745-5890619125bf request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.446 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '44e7315b-3c8f-4079-8553-02a6fd6f107d', 'name': 'tempest-TestNetworkBasicOps-server-953490707', 'flavor': {'id': '53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd90b557db9104ecfb816b1cdab8712bd', 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'hostId': 'd6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.446 16 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.446 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4310>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.447 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4310>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.447 16 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: memory.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.447 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2025-11-25T06:20:02.447147) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.459 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/memory.usage volume: 46.4765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.469 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.470 16 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 44e7315b-3c8f-4079-8553-02a6fd6f107d: ceilometer.compute.pollsters.NoVolumeException
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.470 16 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.470 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.470 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.470 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4910>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.470 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4910>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.470 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.470 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2025-11-25T06:20:02.470782) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.472 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4b7f1c44-c36c-4ce9-b498-4984df4111b3 / tapb0597686-1f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.472 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.473 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 44e7315b-3c8f-4079-8553-02a6fd6f107d / tap8679bd61-00 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.474 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.474 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.474 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.474 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.474 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4700>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.474 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4700>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.474 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.474 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets (2025-11-25T06:20:02.474828) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.474 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/network.outgoing.packets volume: 134 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.475 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.475 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.475 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.475 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.475 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2310>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.475 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2310>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.475 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.475 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2025-11-25T06:20:02.475872) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.491 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.write.latency volume: 362405397 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.492 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.509 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.509 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.509 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.509 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.510 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.510 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2970>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.510 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2970>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.510 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.510 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2025-11-25T06:20:02.510237) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.516 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.516 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.523 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.523 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.523 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.523 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.523 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.524 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b23a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.524 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b23a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.524 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.524 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2025-11-25T06:20:02.524147) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.524 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.read.requests volume: 1073 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.524 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.read.requests volume: 113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.524 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.524 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.525 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.525 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.525 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.525 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2520>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.525 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2520>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.525 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.525 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2025-11-25T06:20:02.525688) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.525 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.read.bytes volume: 29886976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.526 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.read.bytes volume: 284990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.526 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.526 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.526 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.526 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.526 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.527 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c42b0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.527 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c42b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.527 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.527 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.527 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.527 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2025-11-25T06:20:02.527162) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.527 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.527 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.528 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.528 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2d00>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.528 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2d00>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.528 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.528 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.write.bytes volume: 73003008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.528 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2025-11-25T06:20:02.528209) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.528 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.528 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.528 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.529 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.529 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.529 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.529 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4490>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.529 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4490>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.529 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.529 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2025-11-25T06:20:02.529652) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.529 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.530 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.530 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.530 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.530 16 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.530 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800ca460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.530 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800ca460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.530 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: power.state heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.530 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2025-11-25T06:20:02.530697) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.530 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.531 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.531 16 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.531 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.531 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.531 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b25e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.531 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b25e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.531 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.531 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2025-11-25T06:20:02.531712) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.531 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.read.latency volume: 183773718 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.532 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.read.latency volume: 105771993 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.532 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.read.latency volume: 144528056 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.532 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.read.latency volume: 3636615 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.532 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.533 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.533 16 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.533 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2a60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.533 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2a60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.533 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.533 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.534 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.534 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.534 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2025-11-25T06:20:02.533320) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.534 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4760>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.534 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4760>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.534 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.534 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2025-11-25T06:20:02.534362) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.534 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.534 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1724422885>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-953490707>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1724422885>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-953490707>]
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.535 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.535 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.535 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4bb0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.535 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4bb0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.535 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.535 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2025-11-25T06:20:02.535292) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.535 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.535 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1724422885>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-953490707>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1724422885>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-953490707>]
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.535 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.535 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.535 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4d30>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.535 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4d30>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.536 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.536 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/network.incoming.packets.drop volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.536 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.536 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.536 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2025-11-25T06:20:02.535979) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.536 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.536 16 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.536 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800afdc0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.536 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800afdc0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.537 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: cpu heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.537 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/cpu volume: 9620000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.537 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/cpu volume: 5720000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.537 16 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.537 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2025-11-25T06:20:02.537034) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.537 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.537 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.537 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c44f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.537 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c44f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.538 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.538 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/network.incoming.bytes volume: 25104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.538 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.538 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.538 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.538 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.538 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4af0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.539 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4af0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.539 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.539 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.539 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.539 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2025-11-25T06:20:02.538064) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.539 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.539 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.539 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.539 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b28e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.540 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b28e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.540 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.540 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.540 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.540 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2025-11-25T06:20:02.539085) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.540 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.540 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.541 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.541 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2025-11-25T06:20:02.540136) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.541 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.541 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.541 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4c70>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.541 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4c70>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.541 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.541 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2025-11-25T06:20:02.541661) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.541 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/network.incoming.packets volume: 125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.542 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.542 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.542 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.542 16 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.542 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2b20>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.542 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2b20>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.542 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.root.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.542 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2025-11-25T06:20:02.542697) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.543 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.543 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.543 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.543 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2f40>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.543 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2f40>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.543 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.543 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.write.requests volume: 333 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.543 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.543 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2025-11-25T06:20:02.543553) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.544 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.544 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.544 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.544 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.544 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.544 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4100>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.544 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4100>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.545 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.545 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.545 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.545 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.545 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2025-11-25T06:20:02.545025) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.545 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.545 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.545 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4070>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.546 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4070>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.546 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.546 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/network.outgoing.bytes volume: 19832 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.546 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2025-11-25T06:20:02.546068) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.546 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.546 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.546 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.546 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.546 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.547 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.547 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.547 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.547 16 DEBUG ceilometer.compute.pollsters [-] 4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.547 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.547 16 DEBUG ceilometer.compute.pollsters [-] 44e7315b-3c8f-4079-8553-02a6fd6f107d/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.548 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Nov 25 06:20:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:20:02.548 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2025-11-25T06:20:02.547082) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:20:05 compute-0 podman[212343]: 2025-11-25 06:20:05.063834185 +0000 UTC m=+0.041988433 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:20:06 compute-0 nova_compute[186241]: 2025-11-25 06:20:06.886 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:06 compute-0 ovn_controller[95135]: 2025-11-25T06:20:06Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a6:18:bc 10.100.0.30
Nov 25 06:20:06 compute-0 ovn_controller[95135]: 2025-11-25T06:20:06Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a6:18:bc 10.100.0.30
Nov 25 06:20:07 compute-0 nova_compute[186241]: 2025-11-25 06:20:07.155 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:11 compute-0 podman[212369]: 2025-11-25 06:20:11.059087476 +0000 UTC m=+0.038296864 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:20:11 compute-0 nova_compute[186241]: 2025-11-25 06:20:11.887 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:12 compute-0 nova_compute[186241]: 2025-11-25 06:20:12.157 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:14 compute-0 nova_compute[186241]: 2025-11-25 06:20:14.969 186245 DEBUG oslo_concurrency.lockutils [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "44e7315b-3c8f-4079-8553-02a6fd6f107d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:14 compute-0 nova_compute[186241]: 2025-11-25 06:20:14.969 186245 DEBUG oslo_concurrency.lockutils [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:14 compute-0 nova_compute[186241]: 2025-11-25 06:20:14.970 186245 DEBUG oslo_concurrency.lockutils [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:14 compute-0 nova_compute[186241]: 2025-11-25 06:20:14.970 186245 DEBUG oslo_concurrency.lockutils [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:14 compute-0 nova_compute[186241]: 2025-11-25 06:20:14.970 186245 DEBUG oslo_concurrency.lockutils [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:14 compute-0 nova_compute[186241]: 2025-11-25 06:20:14.971 186245 INFO nova.compute.manager [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Terminating instance
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.474 186245 DEBUG nova.compute.manager [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:20:15 compute-0 kernel: tap8679bd61-00 (unregistering): left promiscuous mode
Nov 25 06:20:15 compute-0 NetworkManager[55345]: <info>  [1764051615.4978] device (tap8679bd61-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:20:15 compute-0 ovn_controller[95135]: 2025-11-25T06:20:15Z|00051|binding|INFO|Releasing lport 8679bd61-0016-49b1-a137-221904386339 from this chassis (sb_readonly=0)
Nov 25 06:20:15 compute-0 ovn_controller[95135]: 2025-11-25T06:20:15Z|00052|binding|INFO|Setting lport 8679bd61-0016-49b1-a137-221904386339 down in Southbound
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.503 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:15 compute-0 ovn_controller[95135]: 2025-11-25T06:20:15Z|00053|binding|INFO|Removing iface tap8679bd61-00 ovn-installed in OVS
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.506 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.510 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:18:bc 10.100.0.30'], port_security=['fa:16:3e:a6:18:bc 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': '44e7315b-3c8f-4079-8553-02a6fd6f107d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-622ce19f-960f-4b6d-93c3-22c8073dbf77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '2944a666-502c-4b4a-991f-0b2eb4d34ba8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62086cc8-0f51-42bb-a9b0-6996044ab0f9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=8679bd61-0016-49b1-a137-221904386339) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.511 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 8679bd61-0016-49b1-a137-221904386339 in datapath 622ce19f-960f-4b6d-93c3-22c8073dbf77 unbound from our chassis
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.512 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 622ce19f-960f-4b6d-93c3-22c8073dbf77, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.513 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[cd839456-ab81-4c4a-93ae-b0f0662ed253]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.513 103953 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77 namespace which is not needed anymore
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.518 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:15 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Nov 25 06:20:15 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 10.528s CPU time.
Nov 25 06:20:15 compute-0 systemd-machined[152921]: Machine qemu-2-instance-00000002 terminated.
Nov 25 06:20:15 compute-0 neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77[212292]: [NOTICE]   (212309) : haproxy version is 2.8.14-c23fe91
Nov 25 06:20:15 compute-0 neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77[212292]: [NOTICE]   (212309) : path to executable is /usr/sbin/haproxy
Nov 25 06:20:15 compute-0 neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77[212292]: [WARNING]  (212309) : Exiting Master process...
Nov 25 06:20:15 compute-0 podman[212410]: 2025-11-25 06:20:15.598279405 +0000 UTC m=+0.019635404 container kill e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 25 06:20:15 compute-0 neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77[212292]: [ALERT]    (212309) : Current worker (212313) exited with code 143 (Terminated)
Nov 25 06:20:15 compute-0 neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77[212292]: [WARNING]  (212309) : All workers exited. Exiting... (0)
Nov 25 06:20:15 compute-0 systemd[1]: libpod-e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d.scope: Deactivated successfully.
Nov 25 06:20:15 compute-0 conmon[212292]: conmon e278a9c82c7477028537 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d.scope/container/memory.events
Nov 25 06:20:15 compute-0 podman[212421]: 2025-11-25 06:20:15.633045903 +0000 UTC m=+0.019242956 container died e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 25 06:20:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d-userdata-shm.mount: Deactivated successfully.
Nov 25 06:20:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-afda3b8ae862b9a6c0f05633a9a00ef0d28360810a58f1f83cf4225e71162aeb-merged.mount: Deactivated successfully.
Nov 25 06:20:15 compute-0 podman[212421]: 2025-11-25 06:20:15.653558431 +0000 UTC m=+0.039755475 container cleanup e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 06:20:15 compute-0 systemd[1]: libpod-conmon-e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d.scope: Deactivated successfully.
Nov 25 06:20:15 compute-0 podman[212423]: 2025-11-25 06:20:15.663145985 +0000 UTC m=+0.044909990 container remove e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.666 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[a0307bef-2592-408f-983a-74d0b1bbaafd]: (4, ("Tue Nov 25 06:20:15 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77 (e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d)\ne278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d\nTue Nov 25 06:20:15 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77 (e278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d)\ne278a9c82c7477028537e1fe17317bf00e05b42e8292f239fe73637588edf85d\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.668 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7525497c-1ea7-4431-9132-f2bdfed80101]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.668 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/622ce19f-960f-4b6d-93c3-22c8073dbf77.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/622ce19f-960f-4b6d-93c3-22c8073dbf77.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.668 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[bf522281-ec0c-4b0c-803c-209497672b2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.669 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap622ce19f-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.671 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:15 compute-0 kernel: tap622ce19f-90: left promiscuous mode
Nov 25 06:20:15 compute-0 NetworkManager[55345]: <info>  [1764051615.6861] manager: (tap8679bd61-00): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.686 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.689 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8a4cad54-5c8d-4b61-a92a-26793ec6f04b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.697 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1a5c36-1d60-4a53-8a80-f59e8753f470]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.698 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[23d0691d-87cc-40b4-8dd5-bc1425b035d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.713 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8020b749-d5e3-42ae-953d-901bd4bd8b5a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 258403, 'reachable_time': 27424, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212460, 'error': None, 'target': 'ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.716 186245 INFO nova.virt.libvirt.driver [-] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Instance destroyed successfully.
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.717 186245 DEBUG nova.objects.instance [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid 44e7315b-3c8f-4079-8553-02a6fd6f107d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:20:15 compute-0 systemd[1]: run-netns-ovnmeta\x2d622ce19f\x2d960f\x2d4b6d\x2d93c3\x2d22c8073dbf77.mount: Deactivated successfully.
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.722 104066 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-622ce19f-960f-4b6d-93c3-22c8073dbf77 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 06:20:15 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:15.723 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[29fb995d-bfed-448d-b0a6-490b5e21b491]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.794 186245 DEBUG nova.compute.manager [req-917ee96e-2248-4890-83a0-5a58f1294e9c req-33682a7b-5193-4e46-9fc7-fa16d38371d9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Received event network-vif-unplugged-8679bd61-0016-49b1-a137-221904386339 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.794 186245 DEBUG oslo_concurrency.lockutils [req-917ee96e-2248-4890-83a0-5a58f1294e9c req-33682a7b-5193-4e46-9fc7-fa16d38371d9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.794 186245 DEBUG oslo_concurrency.lockutils [req-917ee96e-2248-4890-83a0-5a58f1294e9c req-33682a7b-5193-4e46-9fc7-fa16d38371d9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.794 186245 DEBUG oslo_concurrency.lockutils [req-917ee96e-2248-4890-83a0-5a58f1294e9c req-33682a7b-5193-4e46-9fc7-fa16d38371d9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.795 186245 DEBUG nova.compute.manager [req-917ee96e-2248-4890-83a0-5a58f1294e9c req-33682a7b-5193-4e46-9fc7-fa16d38371d9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] No waiting events found dispatching network-vif-unplugged-8679bd61-0016-49b1-a137-221904386339 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.795 186245 DEBUG nova.compute.manager [req-917ee96e-2248-4890-83a0-5a58f1294e9c req-33682a7b-5193-4e46-9fc7-fa16d38371d9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Received event network-vif-unplugged-8679bd61-0016-49b1-a137-221904386339 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:20:15 compute-0 nova_compute[186241]: 2025-11-25 06:20:15.932 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11834
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.220 186245 DEBUG nova.virt.libvirt.vif [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:19:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-953490707',display_name='tempest-TestNetworkBasicOps-server-953490707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-953490707',id=2,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+Cg1gQgYMJkyvh+3AQUwPVx+19d/oRZHSGYziiVMz+/V6NCZoJWvcWk5srLg/9x/dqcPR0nplDGGiH3SNDL8e2kvlFYD480J7OURsqH5WOEs4u+Wwm+3s34NJcZa9xkQ==',key_name='tempest-TestNetworkBasicOps-348265766',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:19:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-yl9ehua5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:19:57Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=44e7315b-3c8f-4079-8553-02a6fd6f107d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8679bd61-0016-49b1-a137-221904386339", "address": "fa:16:3e:a6:18:bc", "network": {"id": "622ce19f-960f-4b6d-93c3-22c8073dbf77", "bridge": "br-int", "label": "tempest-network-smoke--54157590", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8679bd61-00", "ovs_interfaceid": "8679bd61-0016-49b1-a137-221904386339", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.221 186245 DEBUG nova.network.os_vif_util [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "8679bd61-0016-49b1-a137-221904386339", "address": "fa:16:3e:a6:18:bc", "network": {"id": "622ce19f-960f-4b6d-93c3-22c8073dbf77", "bridge": "br-int", "label": "tempest-network-smoke--54157590", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8679bd61-00", "ovs_interfaceid": "8679bd61-0016-49b1-a137-221904386339", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.222 186245 DEBUG nova.network.os_vif_util [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:18:bc,bridge_name='br-int',has_traffic_filtering=True,id=8679bd61-0016-49b1-a137-221904386339,network=Network(622ce19f-960f-4b6d-93c3-22c8073dbf77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8679bd61-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.222 186245 DEBUG os_vif [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:18:bc,bridge_name='br-int',has_traffic_filtering=True,id=8679bd61-0016-49b1-a137-221904386339,network=Network(622ce19f-960f-4b6d-93c3-22c8073dbf77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8679bd61-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.224 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.224 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8679bd61-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.225 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.226 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.227 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.227 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b4096eb3-fff6-4542-8db4-14e004811610) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.228 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.228 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.230 186245 INFO os_vif [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:18:bc,bridge_name='br-int',has_traffic_filtering=True,id=8679bd61-0016-49b1-a137-221904386339,network=Network(622ce19f-960f-4b6d-93c3-22c8073dbf77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8679bd61-00')
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.230 186245 INFO nova.virt.libvirt.driver [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Deleting instance files /var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d_del
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.231 186245 INFO nova.virt.libvirt.driver [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Deletion of /var/lib/nova/instances/44e7315b-3c8f-4079-8553-02a6fd6f107d_del complete
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.434 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11843
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.435 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.435 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11872
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.738 186245 INFO nova.compute.manager [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Took 1.26 seconds to destroy the instance on the hypervisor.
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.738 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.739 186245 DEBUG nova.compute.manager [-] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.739 186245 DEBUG nova.network.neutron [-] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.888 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:16 compute-0 nova_compute[186241]: 2025-11-25 06:20:16.938 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:20:17 compute-0 nova_compute[186241]: 2025-11-25 06:20:17.947 186245 DEBUG nova.compute.manager [req-2fb1054b-13e5-48c2-9f97-957ad1597d83 req-b9e9ce1d-76ab-4b6e-90d2-ac91f43a608e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Received event network-vif-plugged-8679bd61-0016-49b1-a137-221904386339 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:20:17 compute-0 nova_compute[186241]: 2025-11-25 06:20:17.947 186245 DEBUG oslo_concurrency.lockutils [req-2fb1054b-13e5-48c2-9f97-957ad1597d83 req-b9e9ce1d-76ab-4b6e-90d2-ac91f43a608e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:17 compute-0 nova_compute[186241]: 2025-11-25 06:20:17.947 186245 DEBUG oslo_concurrency.lockutils [req-2fb1054b-13e5-48c2-9f97-957ad1597d83 req-b9e9ce1d-76ab-4b6e-90d2-ac91f43a608e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:17 compute-0 nova_compute[186241]: 2025-11-25 06:20:17.948 186245 DEBUG oslo_concurrency.lockutils [req-2fb1054b-13e5-48c2-9f97-957ad1597d83 req-b9e9ce1d-76ab-4b6e-90d2-ac91f43a608e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:17 compute-0 nova_compute[186241]: 2025-11-25 06:20:17.948 186245 DEBUG nova.compute.manager [req-2fb1054b-13e5-48c2-9f97-957ad1597d83 req-b9e9ce1d-76ab-4b6e-90d2-ac91f43a608e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] No waiting events found dispatching network-vif-plugged-8679bd61-0016-49b1-a137-221904386339 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:20:17 compute-0 nova_compute[186241]: 2025-11-25 06:20:17.948 186245 WARNING nova.compute.manager [req-2fb1054b-13e5-48c2-9f97-957ad1597d83 req-b9e9ce1d-76ab-4b6e-90d2-ac91f43a608e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Received unexpected event network-vif-plugged-8679bd61-0016-49b1-a137-221904386339 for instance with vm_state active and task_state deleting.
Nov 25 06:20:17 compute-0 nova_compute[186241]: 2025-11-25 06:20:17.948 186245 DEBUG nova.compute.manager [req-2fb1054b-13e5-48c2-9f97-957ad1597d83 req-b9e9ce1d-76ab-4b6e-90d2-ac91f43a608e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Received event network-vif-deleted-8679bd61-0016-49b1-a137-221904386339 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:20:17 compute-0 nova_compute[186241]: 2025-11-25 06:20:17.948 186245 INFO nova.compute.manager [req-2fb1054b-13e5-48c2-9f97-957ad1597d83 req-b9e9ce1d-76ab-4b6e-90d2-ac91f43a608e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Neutron deleted interface 8679bd61-0016-49b1-a137-221904386339; detaching it from the instance and deleting it from the info cache
Nov 25 06:20:17 compute-0 nova_compute[186241]: 2025-11-25 06:20:17.948 186245 DEBUG nova.network.neutron [req-2fb1054b-13e5-48c2-9f97-957ad1597d83 req-b9e9ce1d-76ab-4b6e-90d2-ac91f43a608e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:20:18 compute-0 nova_compute[186241]: 2025-11-25 06:20:18.045 186245 DEBUG nova.network.neutron [-] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:20:18 compute-0 podman[212469]: 2025-11-25 06:20:18.085018067 +0000 UTC m=+0.062737416 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Nov 25 06:20:18 compute-0 nova_compute[186241]: 2025-11-25 06:20:18.453 186245 DEBUG nova.compute.manager [req-2fb1054b-13e5-48c2-9f97-957ad1597d83 req-b9e9ce1d-76ab-4b6e-90d2-ac91f43a608e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Detach interface failed, port_id=8679bd61-0016-49b1-a137-221904386339, reason: Instance 44e7315b-3c8f-4079-8553-02a6fd6f107d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Nov 25 06:20:18 compute-0 nova_compute[186241]: 2025-11-25 06:20:18.547 186245 INFO nova.compute.manager [-] [instance: 44e7315b-3c8f-4079-8553-02a6fd6f107d] Took 1.81 seconds to deallocate network for instance.
Nov 25 06:20:19 compute-0 nova_compute[186241]: 2025-11-25 06:20:19.053 186245 DEBUG oslo_concurrency.lockutils [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:19 compute-0 nova_compute[186241]: 2025-11-25 06:20:19.053 186245 DEBUG oslo_concurrency.lockutils [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:19 compute-0 nova_compute[186241]: 2025-11-25 06:20:19.116 186245 DEBUG nova.compute.provider_tree [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:20:19 compute-0 nova_compute[186241]: 2025-11-25 06:20:19.441 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:20:19 compute-0 nova_compute[186241]: 2025-11-25 06:20:19.441 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:20:19 compute-0 nova_compute[186241]: 2025-11-25 06:20:19.441 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:20:19 compute-0 nova_compute[186241]: 2025-11-25 06:20:19.619 186245 DEBUG nova.scheduler.client.report [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:20:19 compute-0 nova_compute[186241]: 2025-11-25 06:20:19.928 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:20:19 compute-0 nova_compute[186241]: 2025-11-25 06:20:19.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:20:19 compute-0 nova_compute[186241]: 2025-11-25 06:20:19.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:20:19 compute-0 nova_compute[186241]: 2025-11-25 06:20:19.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:20:20 compute-0 nova_compute[186241]: 2025-11-25 06:20:20.125 186245 DEBUG oslo_concurrency.lockutils [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:20 compute-0 nova_compute[186241]: 2025-11-25 06:20:20.142 186245 INFO nova.scheduler.client.report [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance 44e7315b-3c8f-4079-8553-02a6fd6f107d
Nov 25 06:20:20 compute-0 nova_compute[186241]: 2025-11-25 06:20:20.438 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:20 compute-0 nova_compute[186241]: 2025-11-25 06:20:20.439 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:20 compute-0 nova_compute[186241]: 2025-11-25 06:20:20.439 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:20 compute-0 nova_compute[186241]: 2025-11-25 06:20:20.439 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:20:21 compute-0 nova_compute[186241]: 2025-11-25 06:20:21.150 186245 DEBUG oslo_concurrency.lockutils [None req-8f032175-74c6-4c4c-9bfd-153eb4e69363 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "44e7315b-3c8f-4079-8553-02a6fd6f107d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:21 compute-0 nova_compute[186241]: 2025-11-25 06:20:21.228 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:21 compute-0 nova_compute[186241]: 2025-11-25 06:20:21.462 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:20:21 compute-0 nova_compute[186241]: 2025-11-25 06:20:21.505 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:20:21 compute-0 nova_compute[186241]: 2025-11-25 06:20:21.506 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:20:21 compute-0 nova_compute[186241]: 2025-11-25 06:20:21.548 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:20:21 compute-0 nova_compute[186241]: 2025-11-25 06:20:21.725 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:20:21 compute-0 nova_compute[186241]: 2025-11-25 06:20:21.726 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5625MB free_disk=72.99246597290039GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:20:21 compute-0 nova_compute[186241]: 2025-11-25 06:20:21.726 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:21 compute-0 nova_compute[186241]: 2025-11-25 06:20:21.726 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:21 compute-0 nova_compute[186241]: 2025-11-25 06:20:21.890 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:22 compute-0 podman[212500]: 2025-11-25 06:20:22.059633514 +0000 UTC m=+0.038512780 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:20:22 compute-0 podman[212499]: 2025-11-25 06:20:22.060091387 +0000 UTC m=+0.040011336 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 25 06:20:22 compute-0 nova_compute[186241]: 2025-11-25 06:20:22.752 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance 4b7f1c44-c36c-4ce9-b498-4984df4111b3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:20:22 compute-0 nova_compute[186241]: 2025-11-25 06:20:22.753 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:20:22 compute-0 nova_compute[186241]: 2025-11-25 06:20:22.753 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:20:22 compute-0 nova_compute[186241]: 2025-11-25 06:20:22.798 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:20:23 compute-0 nova_compute[186241]: 2025-11-25 06:20:23.301 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:20:23 compute-0 nova_compute[186241]: 2025-11-25 06:20:23.807 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:20:23 compute-0 nova_compute[186241]: 2025-11-25 06:20:23.807 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:24 compute-0 ovn_controller[95135]: 2025-11-25T06:20:24Z|00054|binding|INFO|Releasing lport 41d1b45c-dacf-4079-b06f-ab644147f8e7 from this chassis (sb_readonly=0)
Nov 25 06:20:24 compute-0 nova_compute[186241]: 2025-11-25 06:20:24.596 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:24 compute-0 nova_compute[186241]: 2025-11-25 06:20:24.807 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:20:24 compute-0 nova_compute[186241]: 2025-11-25 06:20:24.808 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:20:25 compute-0 nova_compute[186241]: 2025-11-25 06:20:25.732 186245 DEBUG nova.compute.manager [req-78d6c225-54ad-4f32-b8f0-3f4e9fe0350e req-f7f64f71-32c2-47e1-ac87-923dcdaa5823 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Received event network-changed-b0597686-1f09-4b3d-ad11-27d3fbbdde6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:20:25 compute-0 nova_compute[186241]: 2025-11-25 06:20:25.732 186245 DEBUG nova.compute.manager [req-78d6c225-54ad-4f32-b8f0-3f4e9fe0350e req-f7f64f71-32c2-47e1-ac87-923dcdaa5823 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Refreshing instance network info cache due to event network-changed-b0597686-1f09-4b3d-ad11-27d3fbbdde6c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:20:25 compute-0 nova_compute[186241]: 2025-11-25 06:20:25.733 186245 DEBUG oslo_concurrency.lockutils [req-78d6c225-54ad-4f32-b8f0-3f4e9fe0350e req-f7f64f71-32c2-47e1-ac87-923dcdaa5823 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-4b7f1c44-c36c-4ce9-b498-4984df4111b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:20:25 compute-0 nova_compute[186241]: 2025-11-25 06:20:25.733 186245 DEBUG oslo_concurrency.lockutils [req-78d6c225-54ad-4f32-b8f0-3f4e9fe0350e req-f7f64f71-32c2-47e1-ac87-923dcdaa5823 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-4b7f1c44-c36c-4ce9-b498-4984df4111b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:20:25 compute-0 nova_compute[186241]: 2025-11-25 06:20:25.733 186245 DEBUG nova.network.neutron [req-78d6c225-54ad-4f32-b8f0-3f4e9fe0350e req-f7f64f71-32c2-47e1-ac87-923dcdaa5823 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Refreshing network info cache for port b0597686-1f09-4b3d-ad11-27d3fbbdde6c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.230 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.275 186245 DEBUG oslo_concurrency.lockutils [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.276 186245 DEBUG oslo_concurrency.lockutils [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.276 186245 DEBUG oslo_concurrency.lockutils [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.276 186245 DEBUG oslo_concurrency.lockutils [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.277 186245 DEBUG oslo_concurrency.lockutils [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.277 186245 INFO nova.compute.manager [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Terminating instance
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.781 186245 DEBUG nova.compute.manager [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:20:26 compute-0 kernel: tapb0597686-1f (unregistering): left promiscuous mode
Nov 25 06:20:26 compute-0 NetworkManager[55345]: <info>  [1764051626.8064] device (tapb0597686-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:20:26 compute-0 ovn_controller[95135]: 2025-11-25T06:20:26Z|00055|binding|INFO|Releasing lport b0597686-1f09-4b3d-ad11-27d3fbbdde6c from this chassis (sb_readonly=0)
Nov 25 06:20:26 compute-0 ovn_controller[95135]: 2025-11-25T06:20:26Z|00056|binding|INFO|Setting lport b0597686-1f09-4b3d-ad11-27d3fbbdde6c down in Southbound
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.810 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:26 compute-0 ovn_controller[95135]: 2025-11-25T06:20:26Z|00057|binding|INFO|Removing iface tapb0597686-1f ovn-installed in OVS
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.812 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:26.815 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:c9:8b 10.100.0.12'], port_security=['fa:16:3e:cd:c9:8b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4b7f1c44-c36c-4ce9-b498-4984df4111b3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e4c5e99-aead-49a3-910e-5959edf0d03a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a247d86c-48ea-4aa8-9306-726933f4704f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6fb2701-3644-4fc1-81d0-634fa89abf62, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=b0597686-1f09-4b3d-ad11-27d3fbbdde6c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:20:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:26.816 103953 INFO neutron.agent.ovn.metadata.agent [-] Port b0597686-1f09-4b3d-ad11-27d3fbbdde6c in datapath 0e4c5e99-aead-49a3-910e-5959edf0d03a unbound from our chassis
Nov 25 06:20:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:26.817 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0e4c5e99-aead-49a3-910e-5959edf0d03a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:20:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:26.818 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7bdeeb03-46a7-49ba-b0ef-7ff9193ce238]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:26.818 103953 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a namespace which is not needed anymore
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.835 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:26 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Nov 25 06:20:26 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 11.898s CPU time.
Nov 25 06:20:26 compute-0 systemd-machined[152921]: Machine qemu-1-instance-00000001 terminated.
Nov 25 06:20:26 compute-0 podman[212538]: 2025-11-25 06:20:26.868078557 +0000 UTC m=+0.046812637 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.889 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:26 compute-0 neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a[211938]: [NOTICE]   (211953) : haproxy version is 2.8.14-c23fe91
Nov 25 06:20:26 compute-0 neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a[211938]: [NOTICE]   (211953) : path to executable is /usr/sbin/haproxy
Nov 25 06:20:26 compute-0 neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a[211938]: [WARNING]  (211953) : Exiting Master process...
Nov 25 06:20:26 compute-0 podman[212575]: 2025-11-25 06:20:26.908806482 +0000 UTC m=+0.020284078 container kill 7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 25 06:20:26 compute-0 neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a[211938]: [ALERT]    (211953) : Current worker (211955) exited with code 143 (Terminated)
Nov 25 06:20:26 compute-0 neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a[211938]: [WARNING]  (211953) : All workers exited. Exiting... (0)
Nov 25 06:20:26 compute-0 systemd[1]: libpod-7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94.scope: Deactivated successfully.
Nov 25 06:20:26 compute-0 podman[212587]: 2025-11-25 06:20:26.939534848 +0000 UTC m=+0.016416918 container died 7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 06:20:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94-userdata-shm.mount: Deactivated successfully.
Nov 25 06:20:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-ec5fce7cd471331274e23735b35264bec29183e6ccaeb4597dc20ca064d894fe-merged.mount: Deactivated successfully.
Nov 25 06:20:26 compute-0 podman[212587]: 2025-11-25 06:20:26.956905022 +0000 UTC m=+0.033787093 container cleanup 7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:20:26 compute-0 systemd[1]: libpod-conmon-7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94.scope: Deactivated successfully.
Nov 25 06:20:26 compute-0 podman[212588]: 2025-11-25 06:20:26.966753527 +0000 UTC m=+0.041419740 container remove 7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:20:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:26.980 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[b4ecc2f4-f67e-43ae-a1c0-98ba0d45e9d4]: (4, ("Tue Nov 25 06:20:26 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a (7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94)\n7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94\nTue Nov 25 06:20:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a (7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94)\n7c6d9159024afb18230cacaa4ceae3ec41676f15af73b6a60376a4d381a08e94\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:26.981 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[16372ef4-3372-4440-a5af-f3aa69b4d4e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:26.981 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0e4c5e99-aead-49a3-910e-5959edf0d03a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0e4c5e99-aead-49a3-910e-5959edf0d03a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:20:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:26.982 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[673992ee-930f-46f2-b216-a10620e8cacd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:26.982 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e4c5e99-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:20:26 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.983 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:26 compute-0 kernel: tap0e4c5e99-a0: left promiscuous mode
Nov 25 06:20:26 compute-0 NetworkManager[55345]: <info>  [1764051626.9992] manager: (tapb0597686-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.999 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:26.999 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:27 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:27.002 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[1d783c4e-838f-4bf3-a1ec-5c7ae2f84e54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:27 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:27.014 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[04daf725-4a32-434b-b19e-dcac36ce2f94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:27 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:27.015 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[4b1679a7-2eda-4054-9dad-931e99eaabf6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.022 186245 INFO nova.virt.libvirt.driver [-] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Instance destroyed successfully.
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.023 186245 DEBUG nova.objects.instance [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid 4b7f1c44-c36c-4ce9-b498-4984df4111b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:20:27 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:27.027 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[956cff7c-9a9b-4654-92e1-c4020f08ed82]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 253718, 'reachable_time': 18785, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212625, 'error': None, 'target': 'ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:27 compute-0 systemd[1]: run-netns-ovnmeta\x2d0e4c5e99\x2daead\x2d49a3\x2d910e\x2d5959edf0d03a.mount: Deactivated successfully.
Nov 25 06:20:27 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:27.028 104066 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0e4c5e99-aead-49a3-910e-5959edf0d03a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 06:20:27 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:27.028 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[a6765fd1-9d53-433d-a3e7-d1930eb1b3ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.526 186245 DEBUG nova.virt.libvirt.vif [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1724422885',display_name='tempest-TestNetworkBasicOps-server-1724422885',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1724422885',id=1,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLw9bVqco8U0Wp4LUSfq3rP2Y3G4d9+HrSr18rZe33vIbLU5CgQl8aOPgkakXGAmX+BqLFeIZmSSNp4FTfanwb1Zj2Pr3fHYFwGKmsMT3gYm+uRVfIhQGs0huoytbWm/+w==',key_name='tempest-TestNetworkBasicOps-1956672502',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:19:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-otdipvg7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:19:08Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=4b7f1c44-c36c-4ce9-b498-4984df4111b3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.527 186245 DEBUG nova.network.os_vif_util [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.527 186245 DEBUG nova.network.os_vif_util [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cd:c9:8b,bridge_name='br-int',has_traffic_filtering=True,id=b0597686-1f09-4b3d-ad11-27d3fbbdde6c,network=Network(0e4c5e99-aead-49a3-910e-5959edf0d03a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0597686-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.528 186245 DEBUG os_vif [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:c9:8b,bridge_name='br-int',has_traffic_filtering=True,id=b0597686-1f09-4b3d-ad11-27d3fbbdde6c,network=Network(0e4c5e99-aead-49a3-910e-5959edf0d03a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0597686-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.529 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.529 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb0597686-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.530 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.531 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.532 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.532 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=56e8b9e0-776d-48c7-890b-95c0dca982d3) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.532 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.533 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.534 186245 INFO os_vif [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:c9:8b,bridge_name='br-int',has_traffic_filtering=True,id=b0597686-1f09-4b3d-ad11-27d3fbbdde6c,network=Network(0e4c5e99-aead-49a3-910e-5959edf0d03a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0597686-1f')
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.535 186245 INFO nova.virt.libvirt.driver [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Deleting instance files /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3_del
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.535 186245 INFO nova.virt.libvirt.driver [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Deletion of /var/lib/nova/instances/4b7f1c44-c36c-4ce9-b498-4984df4111b3_del complete
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.869 186245 DEBUG nova.compute.manager [req-da9ad76d-e18c-4128-bf5b-b890ad79b68b req-b259266c-7c10-424b-8c51-69ec0cb42176 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Received event network-vif-unplugged-b0597686-1f09-4b3d-ad11-27d3fbbdde6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.869 186245 DEBUG oslo_concurrency.lockutils [req-da9ad76d-e18c-4128-bf5b-b890ad79b68b req-b259266c-7c10-424b-8c51-69ec0cb42176 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.869 186245 DEBUG oslo_concurrency.lockutils [req-da9ad76d-e18c-4128-bf5b-b890ad79b68b req-b259266c-7c10-424b-8c51-69ec0cb42176 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.869 186245 DEBUG oslo_concurrency.lockutils [req-da9ad76d-e18c-4128-bf5b-b890ad79b68b req-b259266c-7c10-424b-8c51-69ec0cb42176 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.869 186245 DEBUG nova.compute.manager [req-da9ad76d-e18c-4128-bf5b-b890ad79b68b req-b259266c-7c10-424b-8c51-69ec0cb42176 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] No waiting events found dispatching network-vif-unplugged-b0597686-1f09-4b3d-ad11-27d3fbbdde6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.869 186245 DEBUG nova.compute.manager [req-da9ad76d-e18c-4128-bf5b-b890ad79b68b req-b259266c-7c10-424b-8c51-69ec0cb42176 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Received event network-vif-unplugged-b0597686-1f09-4b3d-ad11-27d3fbbdde6c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.870 186245 DEBUG nova.compute.manager [req-da9ad76d-e18c-4128-bf5b-b890ad79b68b req-b259266c-7c10-424b-8c51-69ec0cb42176 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Received event network-vif-plugged-b0597686-1f09-4b3d-ad11-27d3fbbdde6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.870 186245 DEBUG oslo_concurrency.lockutils [req-da9ad76d-e18c-4128-bf5b-b890ad79b68b req-b259266c-7c10-424b-8c51-69ec0cb42176 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.870 186245 DEBUG oslo_concurrency.lockutils [req-da9ad76d-e18c-4128-bf5b-b890ad79b68b req-b259266c-7c10-424b-8c51-69ec0cb42176 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.870 186245 DEBUG oslo_concurrency.lockutils [req-da9ad76d-e18c-4128-bf5b-b890ad79b68b req-b259266c-7c10-424b-8c51-69ec0cb42176 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.870 186245 DEBUG nova.compute.manager [req-da9ad76d-e18c-4128-bf5b-b890ad79b68b req-b259266c-7c10-424b-8c51-69ec0cb42176 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] No waiting events found dispatching network-vif-plugged-b0597686-1f09-4b3d-ad11-27d3fbbdde6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:20:27 compute-0 nova_compute[186241]: 2025-11-25 06:20:27.870 186245 WARNING nova.compute.manager [req-da9ad76d-e18c-4128-bf5b-b890ad79b68b req-b259266c-7c10-424b-8c51-69ec0cb42176 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Received unexpected event network-vif-plugged-b0597686-1f09-4b3d-ad11-27d3fbbdde6c for instance with vm_state active and task_state deleting.
Nov 25 06:20:28 compute-0 nova_compute[186241]: 2025-11-25 06:20:28.041 186245 INFO nova.compute.manager [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Took 1.26 seconds to destroy the instance on the hypervisor.
Nov 25 06:20:28 compute-0 nova_compute[186241]: 2025-11-25 06:20:28.041 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:20:28 compute-0 nova_compute[186241]: 2025-11-25 06:20:28.042 186245 DEBUG nova.compute.manager [-] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:20:28 compute-0 nova_compute[186241]: 2025-11-25 06:20:28.042 186245 DEBUG nova.network.neutron [-] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:20:28 compute-0 nova_compute[186241]: 2025-11-25 06:20:28.385 186245 DEBUG nova.network.neutron [req-78d6c225-54ad-4f32-b8f0-3f4e9fe0350e req-f7f64f71-32c2-47e1-ac87-923dcdaa5823 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Updated VIF entry in instance network info cache for port b0597686-1f09-4b3d-ad11-27d3fbbdde6c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:20:28 compute-0 nova_compute[186241]: 2025-11-25 06:20:28.386 186245 DEBUG nova.network.neutron [req-78d6c225-54ad-4f32-b8f0-3f4e9fe0350e req-f7f64f71-32c2-47e1-ac87-923dcdaa5823 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Updating instance_info_cache with network_info: [{"id": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "address": "fa:16:3e:cd:c9:8b", "network": {"id": "0e4c5e99-aead-49a3-910e-5959edf0d03a", "bridge": "br-int", "label": "tempest-network-smoke--433373947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0597686-1f", "ovs_interfaceid": "b0597686-1f09-4b3d-ad11-27d3fbbdde6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:20:28 compute-0 nova_compute[186241]: 2025-11-25 06:20:28.888 186245 DEBUG oslo_concurrency.lockutils [req-78d6c225-54ad-4f32-b8f0-3f4e9fe0350e req-f7f64f71-32c2-47e1-ac87-923dcdaa5823 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-4b7f1c44-c36c-4ce9-b498-4984df4111b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:20:29 compute-0 nova_compute[186241]: 2025-11-25 06:20:29.255 186245 DEBUG nova.network.neutron [-] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:20:29 compute-0 nova_compute[186241]: 2025-11-25 06:20:29.338 186245 DEBUG nova.compute.manager [req-5f8a4657-a02b-4b18-8bc0-be770ddccdaf req-3d72a120-560a-4aa0-9940-c14475babf3b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Received event network-vif-deleted-b0597686-1f09-4b3d-ad11-27d3fbbdde6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:20:29 compute-0 nova_compute[186241]: 2025-11-25 06:20:29.758 186245 INFO nova.compute.manager [-] [instance: 4b7f1c44-c36c-4ce9-b498-4984df4111b3] Took 1.72 seconds to deallocate network for instance.
Nov 25 06:20:30 compute-0 nova_compute[186241]: 2025-11-25 06:20:30.263 186245 DEBUG oslo_concurrency.lockutils [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:30 compute-0 nova_compute[186241]: 2025-11-25 06:20:30.264 186245 DEBUG oslo_concurrency.lockutils [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:30 compute-0 nova_compute[186241]: 2025-11-25 06:20:30.609 186245 DEBUG nova.compute.provider_tree [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:20:31 compute-0 podman[212629]: 2025-11-25 06:20:31.057919322 +0000 UTC m=+0.037582307 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible)
Nov 25 06:20:31 compute-0 nova_compute[186241]: 2025-11-25 06:20:31.112 186245 DEBUG nova.scheduler.client.report [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:20:31 compute-0 nova_compute[186241]: 2025-11-25 06:20:31.617 186245 DEBUG oslo_concurrency.lockutils [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:31 compute-0 nova_compute[186241]: 2025-11-25 06:20:31.778 186245 INFO nova.scheduler.client.report [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance 4b7f1c44-c36c-4ce9-b498-4984df4111b3
Nov 25 06:20:31 compute-0 nova_compute[186241]: 2025-11-25 06:20:31.891 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:32 compute-0 nova_compute[186241]: 2025-11-25 06:20:32.533 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:32 compute-0 nova_compute[186241]: 2025-11-25 06:20:32.786 186245 DEBUG oslo_concurrency.lockutils [None req-830d6d22-17e8-4ed4-bf88-858ca7100c6d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "4b7f1c44-c36c-4ce9-b498-4984df4111b3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:33 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:33.185 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:20:33 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:33.186 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:20:33 compute-0 nova_compute[186241]: 2025-11-25 06:20:33.186 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:36 compute-0 podman[212649]: 2025-11-25 06:20:36.063069089 +0000 UTC m=+0.043394855 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 06:20:36 compute-0 nova_compute[186241]: 2025-11-25 06:20:36.892 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:37 compute-0 nova_compute[186241]: 2025-11-25 06:20:37.416 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:37 compute-0 nova_compute[186241]: 2025-11-25 06:20:37.492 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:37 compute-0 nova_compute[186241]: 2025-11-25 06:20:37.533 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:41 compute-0 nova_compute[186241]: 2025-11-25 06:20:41.893 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:42 compute-0 podman[212667]: 2025-11-25 06:20:42.05413694 +0000 UTC m=+0.035684498 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:20:42 compute-0 nova_compute[186241]: 2025-11-25 06:20:42.535 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:42 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:42.648 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:08:c7 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e22ff7-b3ad-4c32-9660-e2abd8947790', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad33700d-cdbb-45fc-843f-b6325c07b4bf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=78d6d1f7-b362-45aa-9257-0f3da31c1b09) old=Port_Binding(mac=['fa:16:3e:f3:08:c7'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e22ff7-b3ad-4c32-9660-e2abd8947790', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:20:42 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:42.648 103953 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 78d6d1f7-b362-45aa-9257-0f3da31c1b09 in datapath 48e22ff7-b3ad-4c32-9660-e2abd8947790 updated
Nov 25 06:20:42 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:42.649 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48e22ff7-b3ad-4c32-9660-e2abd8947790, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:20:42 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:42.650 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[bf969a7f-6b2a-4ed4-b42f-9ef62bc593bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:20:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:43.187 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:20:46 compute-0 nova_compute[186241]: 2025-11-25 06:20:46.895 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:47.343 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:47.344 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:20:47.344 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:20:47 compute-0 nova_compute[186241]: 2025-11-25 06:20:47.536 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:49 compute-0 podman[212690]: 2025-11-25 06:20:49.099898569 +0000 UTC m=+0.077675251 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 25 06:20:51 compute-0 nova_compute[186241]: 2025-11-25 06:20:51.897 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:52 compute-0 nova_compute[186241]: 2025-11-25 06:20:52.537 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:53 compute-0 podman[212713]: 2025-11-25 06:20:53.062658547 +0000 UTC m=+0.038775419 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 06:20:53 compute-0 podman[212714]: 2025-11-25 06:20:53.086932097 +0000 UTC m=+0.061701426 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:20:56 compute-0 nova_compute[186241]: 2025-11-25 06:20:56.898 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:57 compute-0 podman[212751]: 2025-11-25 06:20:57.053831269 +0000 UTC m=+0.034094729 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 06:20:57 compute-0 nova_compute[186241]: 2025-11-25 06:20:57.539 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:20:59 compute-0 nova_compute[186241]: 2025-11-25 06:20:59.314 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "394ce10b-bae7-43fa-b133-df28182f99db" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:20:59 compute-0 nova_compute[186241]: 2025-11-25 06:20:59.314 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:20:59 compute-0 nova_compute[186241]: 2025-11-25 06:20:59.817 186245 DEBUG nova.compute.manager [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:21:00 compute-0 nova_compute[186241]: 2025-11-25 06:21:00.343 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:00 compute-0 nova_compute[186241]: 2025-11-25 06:21:00.343 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:00 compute-0 nova_compute[186241]: 2025-11-25 06:21:00.349 186245 DEBUG nova.virt.hardware [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:21:00 compute-0 nova_compute[186241]: 2025-11-25 06:21:00.349 186245 INFO nova.compute.claims [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:21:01 compute-0 nova_compute[186241]: 2025-11-25 06:21:01.389 186245 DEBUG nova.compute.provider_tree [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:21:01 compute-0 nova_compute[186241]: 2025-11-25 06:21:01.893 186245 DEBUG nova.scheduler.client.report [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:21:01 compute-0 nova_compute[186241]: 2025-11-25 06:21:01.900 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:02 compute-0 podman[212767]: 2025-11-25 06:21:02.084096199 +0000 UTC m=+0.050631001 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 06:21:02 compute-0 nova_compute[186241]: 2025-11-25 06:21:02.398 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:02 compute-0 nova_compute[186241]: 2025-11-25 06:21:02.398 186245 DEBUG nova.compute.manager [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:21:02 compute-0 nova_compute[186241]: 2025-11-25 06:21:02.540 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:02 compute-0 nova_compute[186241]: 2025-11-25 06:21:02.904 186245 DEBUG nova.compute.manager [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:21:02 compute-0 nova_compute[186241]: 2025-11-25 06:21:02.904 186245 DEBUG nova.network.neutron [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:21:03 compute-0 nova_compute[186241]: 2025-11-25 06:21:03.221 186245 DEBUG nova.policy [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:21:03 compute-0 nova_compute[186241]: 2025-11-25 06:21:03.409 186245 INFO nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:21:03 compute-0 nova_compute[186241]: 2025-11-25 06:21:03.913 186245 DEBUG nova.compute.manager [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.472 186245 DEBUG nova.network.neutron [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Successfully created port: 7704ac5e-d3f5-484e-b018-096af3d84408 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.923 186245 DEBUG nova.compute.manager [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.924 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.925 186245 INFO nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Creating image(s)
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.925 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.925 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.926 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.927 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.930 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.931 186245 DEBUG oslo_concurrency.processutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.975 186245 DEBUG oslo_concurrency.processutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.976 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.977 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.977 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.980 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:21:04 compute-0 nova_compute[186241]: 2025-11-25 06:21:04.980 186245 DEBUG oslo_concurrency.processutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.025 186245 DEBUG oslo_concurrency.processutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.026 186245 DEBUG oslo_concurrency.processutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.049 186245 DEBUG oslo_concurrency.processutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.049 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.050 186245 DEBUG oslo_concurrency.processutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.095 186245 DEBUG oslo_concurrency.processutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.096 186245 DEBUG nova.virt.disk.api [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.096 186245 DEBUG oslo_concurrency.processutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.141 186245 DEBUG oslo_concurrency.processutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.141 186245 DEBUG nova.virt.disk.api [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.142 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.142 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Ensure instance console log exists: /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.142 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.143 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.143 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.331 186245 DEBUG nova.network.neutron [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Successfully updated port: 7704ac5e-d3f5-484e-b018-096af3d84408 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.511 186245 DEBUG nova.compute.manager [req-008264c9-b176-4fb8-bb14-e26deed01233 req-38c2cded-e14d-4194-8c0e-5146c4f1a989 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-changed-7704ac5e-d3f5-484e-b018-096af3d84408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.511 186245 DEBUG nova.compute.manager [req-008264c9-b176-4fb8-bb14-e26deed01233 req-38c2cded-e14d-4194-8c0e-5146c4f1a989 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Refreshing instance network info cache due to event network-changed-7704ac5e-d3f5-484e-b018-096af3d84408. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.511 186245 DEBUG oslo_concurrency.lockutils [req-008264c9-b176-4fb8-bb14-e26deed01233 req-38c2cded-e14d-4194-8c0e-5146c4f1a989 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.511 186245 DEBUG oslo_concurrency.lockutils [req-008264c9-b176-4fb8-bb14-e26deed01233 req-38c2cded-e14d-4194-8c0e-5146c4f1a989 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.512 186245 DEBUG nova.network.neutron [req-008264c9-b176-4fb8-bb14-e26deed01233 req-38c2cded-e14d-4194-8c0e-5146c4f1a989 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Refreshing network info cache for port 7704ac5e-d3f5-484e-b018-096af3d84408 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:21:05 compute-0 nova_compute[186241]: 2025-11-25 06:21:05.834 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:21:06 compute-0 nova_compute[186241]: 2025-11-25 06:21:06.900 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:07 compute-0 podman[212800]: 2025-11-25 06:21:07.072043411 +0000 UTC m=+0.044841455 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 06:21:07 compute-0 nova_compute[186241]: 2025-11-25 06:21:07.149 186245 DEBUG nova.network.neutron [req-008264c9-b176-4fb8-bb14-e26deed01233 req-38c2cded-e14d-4194-8c0e-5146c4f1a989 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:21:07 compute-0 nova_compute[186241]: 2025-11-25 06:21:07.541 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:08 compute-0 nova_compute[186241]: 2025-11-25 06:21:08.378 186245 DEBUG nova.network.neutron [req-008264c9-b176-4fb8-bb14-e26deed01233 req-38c2cded-e14d-4194-8c0e-5146c4f1a989 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:21:08 compute-0 nova_compute[186241]: 2025-11-25 06:21:08.882 186245 DEBUG oslo_concurrency.lockutils [req-008264c9-b176-4fb8-bb14-e26deed01233 req-38c2cded-e14d-4194-8c0e-5146c4f1a989 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:21:08 compute-0 nova_compute[186241]: 2025-11-25 06:21:08.882 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:21:08 compute-0 nova_compute[186241]: 2025-11-25 06:21:08.882 186245 DEBUG nova.network.neutron [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:21:10 compute-0 nova_compute[186241]: 2025-11-25 06:21:10.131 186245 DEBUG nova.network.neutron [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:21:11 compute-0 nova_compute[186241]: 2025-11-25 06:21:11.665 186245 DEBUG nova.network.neutron [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Updating instance_info_cache with network_info: [{"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:21:11 compute-0 nova_compute[186241]: 2025-11-25 06:21:11.902 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.168 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.169 186245 DEBUG nova.compute.manager [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Instance network_info: |[{"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.171 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Start _get_guest_xml network_info=[{"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.173 186245 WARNING nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.174 186245 DEBUG nova.virt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-352064862', uuid='394ce10b-bae7-43fa-b133-df28182f99db'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_ma
pping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764051672.1746202) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.180 186245 DEBUG nova.virt.libvirt.host [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.180 186245 DEBUG nova.virt.libvirt.host [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.183 186245 DEBUG nova.virt.libvirt.host [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.183 186245 DEBUG nova.virt.libvirt.host [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.184 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.184 186245 DEBUG nova.virt.hardware [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.184 186245 DEBUG nova.virt.hardware [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.184 186245 DEBUG nova.virt.hardware [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.185 186245 DEBUG nova.virt.hardware [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.185 186245 DEBUG nova.virt.hardware [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.185 186245 DEBUG nova.virt.hardware [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.185 186245 DEBUG nova.virt.hardware [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.185 186245 DEBUG nova.virt.hardware [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.186 186245 DEBUG nova.virt.hardware [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.186 186245 DEBUG nova.virt.hardware [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.186 186245 DEBUG nova.virt.hardware [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.188 186245 DEBUG nova.virt.libvirt.vif [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:20:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-352064862',display_name='tempest-TestNetworkBasicOps-server-352064862',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-352064862',id=3,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5eRUy74odxp9Q2Am4HiDIkMdvRYPpw1VUK3zfp+EbN2Ota/jKN8edSaGUzCIGEJamacDqcH0lJ6H/skO0Xvp6BAJvgTjvLUerS98Msbl+Qa+0/i1uo7EnhHPR93WCglQ==',key_name='tempest-TestNetworkBasicOps-1679774923',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-0mx851oz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:21:03Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=394ce10b-bae7-43fa-b133-df28182f99db,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.189 186245 DEBUG nova.network.os_vif_util [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.189 186245 DEBUG nova.network.os_vif_util [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:28:be,bridge_name='br-int',has_traffic_filtering=True,id=7704ac5e-d3f5-484e-b018-096af3d84408,network=Network(48e22ff7-b3ad-4c32-9660-e2abd8947790),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7704ac5e-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.190 186245 DEBUG nova.objects.instance [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid 394ce10b-bae7-43fa-b133-df28182f99db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.542 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.694 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:21:12 compute-0 nova_compute[186241]:   <uuid>394ce10b-bae7-43fa-b133-df28182f99db</uuid>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   <name>instance-00000003</name>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-352064862</nova:name>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:21:12</nova:creationTime>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:21:12 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:21:12 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:21:12 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:21:12 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:21:12 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:21:12 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:21:12 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:21:12 compute-0 nova_compute[186241]:         <nova:port uuid="7704ac5e-d3f5-484e-b018-096af3d84408">
Nov 25 06:21:12 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <system>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <entry name="serial">394ce10b-bae7-43fa-b133-df28182f99db</entry>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <entry name="uuid">394ce10b-bae7-43fa-b133-df28182f99db</entry>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     </system>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   <os>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   </os>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   <features>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   </features>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk.config"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:21:28:be"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <target dev="tap7704ac5e-d3"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/console.log" append="off"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <video>
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     </video>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:21:12 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:21:12 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:21:12 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:21:12 compute-0 nova_compute[186241]: </domain>
Nov 25 06:21:12 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.695 186245 DEBUG nova.compute.manager [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Preparing to wait for external event network-vif-plugged-7704ac5e-d3f5-484e-b018-096af3d84408 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.695 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "394ce10b-bae7-43fa-b133-df28182f99db-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.695 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.695 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.696 186245 DEBUG nova.virt.libvirt.vif [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:20:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-352064862',display_name='tempest-TestNetworkBasicOps-server-352064862',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-352064862',id=3,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5eRUy74odxp9Q2Am4HiDIkMdvRYPpw1VUK3zfp+EbN2Ota/jKN8edSaGUzCIGEJamacDqcH0lJ6H/skO0Xvp6BAJvgTjvLUerS98Msbl+Qa+0/i1uo7EnhHPR93WCglQ==',key_name='tempest-TestNetworkBasicOps-1679774923',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-0mx851oz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:21:03Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=394ce10b-bae7-43fa-b133-df28182f99db,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.696 186245 DEBUG nova.network.os_vif_util [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.697 186245 DEBUG nova.network.os_vif_util [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:28:be,bridge_name='br-int',has_traffic_filtering=True,id=7704ac5e-d3f5-484e-b018-096af3d84408,network=Network(48e22ff7-b3ad-4c32-9660-e2abd8947790),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7704ac5e-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.697 186245 DEBUG os_vif [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:28:be,bridge_name='br-int',has_traffic_filtering=True,id=7704ac5e-d3f5-484e-b018-096af3d84408,network=Network(48e22ff7-b3ad-4c32-9660-e2abd8947790),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7704ac5e-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.697 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.698 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.698 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.699 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.699 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '744c0fad-ee42-5910-bcf1-4b6b25c94bb0', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.700 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.702 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.704 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.704 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7704ac5e-d3, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.705 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap7704ac5e-d3, col_values=(('qos', UUID('44cfdfb1-52ac-45e6-8670-4ddeb88ae522')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.705 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap7704ac5e-d3, col_values=(('external_ids', {'iface-id': '7704ac5e-d3f5-484e-b018-096af3d84408', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:21:28:be', 'vm-uuid': '394ce10b-bae7-43fa-b133-df28182f99db'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.706 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:12 compute-0 NetworkManager[55345]: <info>  [1764051672.7069] manager: (tap7704ac5e-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.708 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.712 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:12 compute-0 nova_compute[186241]: 2025-11-25 06:21:12.713 186245 INFO os_vif [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:28:be,bridge_name='br-int',has_traffic_filtering=True,id=7704ac5e-d3f5-484e-b018-096af3d84408,network=Network(48e22ff7-b3ad-4c32-9660-e2abd8947790),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7704ac5e-d3')
Nov 25 06:21:12 compute-0 podman[212820]: 2025-11-25 06:21:12.77614339 +0000 UTC m=+0.037690860 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 25 06:21:13 compute-0 ovn_controller[95135]: 2025-11-25T06:21:13Z|00058|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 25 06:21:14 compute-0 nova_compute[186241]: 2025-11-25 06:21:14.241 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:21:14 compute-0 nova_compute[186241]: 2025-11-25 06:21:14.242 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:21:14 compute-0 nova_compute[186241]: 2025-11-25 06:21:14.242 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:21:28:be, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:21:14 compute-0 nova_compute[186241]: 2025-11-25 06:21:14.242 186245 INFO nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Using config drive
Nov 25 06:21:16 compute-0 nova_compute[186241]: 2025-11-25 06:21:16.134 186245 INFO nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Creating config drive at /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk.config
Nov 25 06:21:16 compute-0 nova_compute[186241]: 2025-11-25 06:21:16.138 186245 DEBUG oslo_concurrency.processutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmprh5y_qs8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:21:16 compute-0 nova_compute[186241]: 2025-11-25 06:21:16.254 186245 DEBUG oslo_concurrency.processutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmprh5y_qs8" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:21:16 compute-0 kernel: tap7704ac5e-d3: entered promiscuous mode
Nov 25 06:21:16 compute-0 NetworkManager[55345]: <info>  [1764051676.3018] manager: (tap7704ac5e-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Nov 25 06:21:16 compute-0 ovn_controller[95135]: 2025-11-25T06:21:16Z|00059|binding|INFO|Claiming lport 7704ac5e-d3f5-484e-b018-096af3d84408 for this chassis.
Nov 25 06:21:16 compute-0 ovn_controller[95135]: 2025-11-25T06:21:16Z|00060|binding|INFO|7704ac5e-d3f5-484e-b018-096af3d84408: Claiming fa:16:3e:21:28:be 10.100.0.8
Nov 25 06:21:16 compute-0 nova_compute[186241]: 2025-11-25 06:21:16.304 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:16 compute-0 nova_compute[186241]: 2025-11-25 06:21:16.309 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:16 compute-0 systemd-udevd[212857]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:21:16 compute-0 NetworkManager[55345]: <info>  [1764051676.3410] device (tap7704ac5e-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:21:16 compute-0 NetworkManager[55345]: <info>  [1764051676.3420] device (tap7704ac5e-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:21:16 compute-0 systemd-machined[152921]: New machine qemu-3-instance-00000003.
Nov 25 06:21:16 compute-0 nova_compute[186241]: 2025-11-25 06:21:16.364 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:16 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Nov 25 06:21:16 compute-0 ovn_controller[95135]: 2025-11-25T06:21:16Z|00061|binding|INFO|Setting lport 7704ac5e-d3f5-484e-b018-096af3d84408 ovn-installed in OVS
Nov 25 06:21:16 compute-0 nova_compute[186241]: 2025-11-25 06:21:16.371 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:16 compute-0 ovn_controller[95135]: 2025-11-25T06:21:16Z|00062|binding|INFO|Setting lport 7704ac5e-d3f5-484e-b018-096af3d84408 up in Southbound
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.406 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:28:be 10.100.0.8'], port_security=['fa:16:3e:21:28:be 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '394ce10b-bae7-43fa-b133-df28182f99db', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e22ff7-b3ad-4c32-9660-e2abd8947790', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '297c5270-251e-452e-ac3b-951ab3a33218', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad33700d-cdbb-45fc-843f-b6325c07b4bf, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=7704ac5e-d3f5-484e-b018-096af3d84408) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.406 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 7704ac5e-d3f5-484e-b018-096af3d84408 in datapath 48e22ff7-b3ad-4c32-9660-e2abd8947790 bound to our chassis
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.408 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48e22ff7-b3ad-4c32-9660-e2abd8947790
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.417 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[f728740c-7fb1-4318-a36b-ccb9bb7fde01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.418 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48e22ff7-b1 in ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.419 211354 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48e22ff7-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.419 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[eb4e4f71-aed0-4d0c-85b2-d94e4b0a7ebd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.420 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5f4f9bf5-ef60-45eb-a104-ef01df441aa5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.428 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[9720f9c3-79d5-4ac9-b073-15f0b8275ff1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.440 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[c796dc8a-2f2b-4ce8-a164-9e34e7f5b129]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.463 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[4f6e983e-8a9d-4806-9e8f-cfc545d8604b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.466 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[e7dc1153-ae29-4f8a-a28e-9d263850f735]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 NetworkManager[55345]: <info>  [1764051676.4673] manager: (tap48e22ff7-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Nov 25 06:21:16 compute-0 systemd-udevd[212860]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.494 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[44ee3407-e0f7-4349-82ed-ae1964032517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.497 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[77767e8c-9ce1-411a-9b09-ea386bf4783a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 NetworkManager[55345]: <info>  [1764051676.5164] device (tap48e22ff7-b0): carrier: link connected
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.520 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[a9644c40-1979-409a-90ea-6ccb90893a46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.539 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[4410997e-eab6-4dbf-a08f-2bb93a1e6279]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e22ff7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:08:c7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 266492, 'reachable_time': 19796, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212884, 'error': None, 'target': 'ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.552 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[f63db6b5-74db-474a-a48a-503a2f9389c5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef3:8c7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 266492, 'tstamp': 266492}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212885, 'error': None, 'target': 'ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.564 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[005a170f-2707-4d6a-ad6c-b83120a5ae0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48e22ff7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:08:c7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 266492, 'reachable_time': 19796, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212886, 'error': None, 'target': 'ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.588 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ed2cf217-33a8-4ee6-a7a5-dbb027e9c4cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.626 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[30c8ffef-fc16-4307-9fad-1d8094f3b1fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.626 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e22ff7-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.627 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.627 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48e22ff7-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:16 compute-0 NetworkManager[55345]: <info>  [1764051676.6293] manager: (tap48e22ff7-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 25 06:21:16 compute-0 kernel: tap48e22ff7-b0: entered promiscuous mode
Nov 25 06:21:16 compute-0 nova_compute[186241]: 2025-11-25 06:21:16.636 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.646 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48e22ff7-b0, col_values=(('external_ids', {'iface-id': '78d6d1f7-b362-45aa-9257-0f3da31c1b09'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:16 compute-0 nova_compute[186241]: 2025-11-25 06:21:16.647 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:16 compute-0 ovn_controller[95135]: 2025-11-25T06:21:16Z|00063|binding|INFO|Releasing lport 78d6d1f7-b362-45aa-9257-0f3da31c1b09 from this chassis (sb_readonly=0)
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.652 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[694d124d-d23a-4321-b3c3-b08153475443]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 nova_compute[186241]: 2025-11-25 06:21:16.660 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.660 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48e22ff7-b3ad-4c32-9660-e2abd8947790.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48e22ff7-b3ad-4c32-9660-e2abd8947790.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.661 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48e22ff7-b3ad-4c32-9660-e2abd8947790.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48e22ff7-b3ad-4c32-9660-e2abd8947790.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.661 103953 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 48e22ff7-b3ad-4c32-9660-e2abd8947790 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.661 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48e22ff7-b3ad-4c32-9660-e2abd8947790.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48e22ff7-b3ad-4c32-9660-e2abd8947790.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.662 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[1989955b-e995-46fc-a281-095aa410284d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.664 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48e22ff7-b3ad-4c32-9660-e2abd8947790.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48e22ff7-b3ad-4c32-9660-e2abd8947790.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.664 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[2a80b9e3-1f81-4482-86da-b5b8af07c14a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.665 103953 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: global
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     log         /dev/log local0 debug
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     log-tag     haproxy-metadata-proxy-48e22ff7-b3ad-4c32-9660-e2abd8947790
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     user        root
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     group       root
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     maxconn     1024
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     pidfile     /var/lib/neutron/external/pids/48e22ff7-b3ad-4c32-9660-e2abd8947790.pid.haproxy
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     daemon
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: defaults
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     log global
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     mode http
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     option httplog
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     option dontlognull
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     option http-server-close
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     option forwardfor
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     retries                 3
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     timeout http-request    30s
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     timeout connect         30s
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     timeout client          32s
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     timeout server          32s
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     timeout http-keep-alive 30s
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: listen listener
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     bind 169.254.169.254:80
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:     http-request add-header X-OVN-Network-ID 48e22ff7-b3ad-4c32-9660-e2abd8947790
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 06:21:16 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:16.665 103953 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790', 'env', 'PROCESS_TAG=haproxy-48e22ff7-b3ad-4c32-9660-e2abd8947790', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48e22ff7-b3ad-4c32-9660-e2abd8947790.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Nov 25 06:21:16 compute-0 nova_compute[186241]: 2025-11-25 06:21:16.904 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:16 compute-0 podman[212921]: 2025-11-25 06:21:16.979278748 +0000 UTC m=+0.035861810 container create 5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 25 06:21:17 compute-0 systemd[1]: Started libpod-conmon-5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c.scope.
Nov 25 06:21:17 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:21:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed0b142be243ca2eb3e10ae11550ff6f62f0739263890452d449cfd97506ef67/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:21:17 compute-0 podman[212921]: 2025-11-25 06:21:17.054518689 +0000 UTC m=+0.111101761 container init 5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:21:17 compute-0 podman[212921]: 2025-11-25 06:21:17.060000126 +0000 UTC m=+0.116583178 container start 5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 25 06:21:17 compute-0 podman[212921]: 2025-11-25 06:21:16.964131778 +0000 UTC m=+0.020714850 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:21:17 compute-0 neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790[212934]: [NOTICE]   (212938) : New worker (212940) forked
Nov 25 06:21:17 compute-0 neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790[212934]: [NOTICE]   (212938) : Loading success.
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.344 186245 DEBUG nova.compute.manager [req-cfcb8d4d-5265-471a-8b29-229240332d1d req-c1d65c4c-de72-47ce-becd-7c7320843fd1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-vif-plugged-7704ac5e-d3f5-484e-b018-096af3d84408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.345 186245 DEBUG oslo_concurrency.lockutils [req-cfcb8d4d-5265-471a-8b29-229240332d1d req-c1d65c4c-de72-47ce-becd-7c7320843fd1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "394ce10b-bae7-43fa-b133-df28182f99db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.346 186245 DEBUG oslo_concurrency.lockutils [req-cfcb8d4d-5265-471a-8b29-229240332d1d req-c1d65c4c-de72-47ce-becd-7c7320843fd1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.347 186245 DEBUG oslo_concurrency.lockutils [req-cfcb8d4d-5265-471a-8b29-229240332d1d req-c1d65c4c-de72-47ce-becd-7c7320843fd1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.347 186245 DEBUG nova.compute.manager [req-cfcb8d4d-5265-471a-8b29-229240332d1d req-c1d65c4c-de72-47ce-becd-7c7320843fd1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Processing event network-vif-plugged-7704ac5e-d3f5-484e-b018-096af3d84408 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.348 186245 DEBUG nova.compute.manager [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.351 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.353 186245 INFO nova.virt.libvirt.driver [-] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Instance spawned successfully.
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.353 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.706 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.860 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.861 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.861 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.862 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.862 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.862 186245 DEBUG nova.virt.libvirt.driver [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:21:17 compute-0 nova_compute[186241]: 2025-11-25 06:21:17.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:21:18 compute-0 nova_compute[186241]: 2025-11-25 06:21:18.369 186245 INFO nova.compute.manager [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Took 13.44 seconds to spawn the instance on the hypervisor.
Nov 25 06:21:18 compute-0 nova_compute[186241]: 2025-11-25 06:21:18.369 186245 DEBUG nova.compute.manager [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:21:18 compute-0 nova_compute[186241]: 2025-11-25 06:21:18.880 186245 INFO nova.compute.manager [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Took 18.56 seconds to build instance.
Nov 25 06:21:18 compute-0 nova_compute[186241]: 2025-11-25 06:21:18.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:21:19 compute-0 nova_compute[186241]: 2025-11-25 06:21:19.383 186245 DEBUG oslo_concurrency.lockutils [None req-bc6dc9ac-8211-4265-b7f3-cbda071eab8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:19 compute-0 nova_compute[186241]: 2025-11-25 06:21:19.495 186245 DEBUG nova.compute.manager [req-2cc7abb7-798f-49fa-9368-d15666437201 req-b2768462-22e0-401e-a76e-fd4fcc804799 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-vif-plugged-7704ac5e-d3f5-484e-b018-096af3d84408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:21:19 compute-0 nova_compute[186241]: 2025-11-25 06:21:19.496 186245 DEBUG oslo_concurrency.lockutils [req-2cc7abb7-798f-49fa-9368-d15666437201 req-b2768462-22e0-401e-a76e-fd4fcc804799 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "394ce10b-bae7-43fa-b133-df28182f99db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:19 compute-0 nova_compute[186241]: 2025-11-25 06:21:19.496 186245 DEBUG oslo_concurrency.lockutils [req-2cc7abb7-798f-49fa-9368-d15666437201 req-b2768462-22e0-401e-a76e-fd4fcc804799 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:19 compute-0 nova_compute[186241]: 2025-11-25 06:21:19.496 186245 DEBUG oslo_concurrency.lockutils [req-2cc7abb7-798f-49fa-9368-d15666437201 req-b2768462-22e0-401e-a76e-fd4fcc804799 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:19 compute-0 nova_compute[186241]: 2025-11-25 06:21:19.496 186245 DEBUG nova.compute.manager [req-2cc7abb7-798f-49fa-9368-d15666437201 req-b2768462-22e0-401e-a76e-fd4fcc804799 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] No waiting events found dispatching network-vif-plugged-7704ac5e-d3f5-484e-b018-096af3d84408 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:21:19 compute-0 nova_compute[186241]: 2025-11-25 06:21:19.496 186245 WARNING nova.compute.manager [req-2cc7abb7-798f-49fa-9368-d15666437201 req-b2768462-22e0-401e-a76e-fd4fcc804799 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received unexpected event network-vif-plugged-7704ac5e-d3f5-484e-b018-096af3d84408 for instance with vm_state active and task_state None.
Nov 25 06:21:19 compute-0 nova_compute[186241]: 2025-11-25 06:21:19.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:21:19 compute-0 nova_compute[186241]: 2025-11-25 06:21:19.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:21:20 compute-0 podman[212945]: 2025-11-25 06:21:20.087526578 +0000 UTC m=+0.067027060 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Nov 25 06:21:20 compute-0 nova_compute[186241]: 2025-11-25 06:21:20.439 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:20 compute-0 nova_compute[186241]: 2025-11-25 06:21:20.440 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:20 compute-0 nova_compute[186241]: 2025-11-25 06:21:20.440 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:20 compute-0 nova_compute[186241]: 2025-11-25 06:21:20.440 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:21:21 compute-0 nova_compute[186241]: 2025-11-25 06:21:21.467 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:21:21 compute-0 nova_compute[186241]: 2025-11-25 06:21:21.523 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:21:21 compute-0 nova_compute[186241]: 2025-11-25 06:21:21.524 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:21:21 compute-0 nova_compute[186241]: 2025-11-25 06:21:21.578 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:21:21 compute-0 nova_compute[186241]: 2025-11-25 06:21:21.789 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:21:21 compute-0 nova_compute[186241]: 2025-11-25 06:21:21.790 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5596MB free_disk=73.02106475830078GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:21:21 compute-0 nova_compute[186241]: 2025-11-25 06:21:21.790 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:21 compute-0 nova_compute[186241]: 2025-11-25 06:21:21.790 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:21 compute-0 nova_compute[186241]: 2025-11-25 06:21:21.905 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:22 compute-0 NetworkManager[55345]: <info>  [1764051682.6129] manager: (patch-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Nov 25 06:21:22 compute-0 NetworkManager[55345]: <info>  [1764051682.6135] manager: (patch-br-int-to-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Nov 25 06:21:22 compute-0 nova_compute[186241]: 2025-11-25 06:21:22.614 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:22 compute-0 ovn_controller[95135]: 2025-11-25T06:21:22Z|00064|binding|INFO|Releasing lport 78d6d1f7-b362-45aa-9257-0f3da31c1b09 from this chassis (sb_readonly=0)
Nov 25 06:21:22 compute-0 nova_compute[186241]: 2025-11-25 06:21:22.645 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:22 compute-0 ovn_controller[95135]: 2025-11-25T06:21:22Z|00065|binding|INFO|Releasing lport 78d6d1f7-b362-45aa-9257-0f3da31c1b09 from this chassis (sb_readonly=0)
Nov 25 06:21:22 compute-0 nova_compute[186241]: 2025-11-25 06:21:22.649 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:22 compute-0 nova_compute[186241]: 2025-11-25 06:21:22.707 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:22 compute-0 nova_compute[186241]: 2025-11-25 06:21:22.825 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance 394ce10b-bae7-43fa-b133-df28182f99db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:21:22 compute-0 nova_compute[186241]: 2025-11-25 06:21:22.826 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:21:22 compute-0 nova_compute[186241]: 2025-11-25 06:21:22.826 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:21:22 compute-0 nova_compute[186241]: 2025-11-25 06:21:22.856 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:21:23 compute-0 nova_compute[186241]: 2025-11-25 06:21:23.360 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:21:23 compute-0 nova_compute[186241]: 2025-11-25 06:21:23.380 186245 DEBUG nova.compute.manager [req-d4393deb-cbaf-470b-9b11-92b8f9db986d req-47e07777-8d93-4ff4-9aaa-9a4461ffd2c1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-changed-7704ac5e-d3f5-484e-b018-096af3d84408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:21:23 compute-0 nova_compute[186241]: 2025-11-25 06:21:23.380 186245 DEBUG nova.compute.manager [req-d4393deb-cbaf-470b-9b11-92b8f9db986d req-47e07777-8d93-4ff4-9aaa-9a4461ffd2c1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Refreshing instance network info cache due to event network-changed-7704ac5e-d3f5-484e-b018-096af3d84408. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:21:23 compute-0 nova_compute[186241]: 2025-11-25 06:21:23.380 186245 DEBUG oslo_concurrency.lockutils [req-d4393deb-cbaf-470b-9b11-92b8f9db986d req-47e07777-8d93-4ff4-9aaa-9a4461ffd2c1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:21:23 compute-0 nova_compute[186241]: 2025-11-25 06:21:23.380 186245 DEBUG oslo_concurrency.lockutils [req-d4393deb-cbaf-470b-9b11-92b8f9db986d req-47e07777-8d93-4ff4-9aaa-9a4461ffd2c1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:21:23 compute-0 nova_compute[186241]: 2025-11-25 06:21:23.380 186245 DEBUG nova.network.neutron [req-d4393deb-cbaf-470b-9b11-92b8f9db986d req-47e07777-8d93-4ff4-9aaa-9a4461ffd2c1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Refreshing network info cache for port 7704ac5e-d3f5-484e-b018-096af3d84408 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:21:23 compute-0 nova_compute[186241]: 2025-11-25 06:21:23.866 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:21:23 compute-0 nova_compute[186241]: 2025-11-25 06:21:23.867 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:24 compute-0 podman[212978]: 2025-11-25 06:21:24.066507576 +0000 UTC m=+0.041273336 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true)
Nov 25 06:21:24 compute-0 podman[212979]: 2025-11-25 06:21:24.077196804 +0000 UTC m=+0.051233814 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:21:24 compute-0 nova_compute[186241]: 2025-11-25 06:21:24.863 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:21:24 compute-0 nova_compute[186241]: 2025-11-25 06:21:24.863 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:21:25 compute-0 nova_compute[186241]: 2025-11-25 06:21:25.369 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:21:25 compute-0 nova_compute[186241]: 2025-11-25 06:21:25.370 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:21:25 compute-0 nova_compute[186241]: 2025-11-25 06:21:25.370 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:21:25 compute-0 nova_compute[186241]: 2025-11-25 06:21:25.371 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:21:26 compute-0 nova_compute[186241]: 2025-11-25 06:21:26.907 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:27 compute-0 ovn_controller[95135]: 2025-11-25T06:21:27Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:21:28:be 10.100.0.8
Nov 25 06:21:27 compute-0 ovn_controller[95135]: 2025-11-25T06:21:27Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:21:28:be 10.100.0.8
Nov 25 06:21:27 compute-0 nova_compute[186241]: 2025-11-25 06:21:27.709 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:28 compute-0 podman[213028]: 2025-11-25 06:21:28.069954285 +0000 UTC m=+0.049675944 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 25 06:21:30 compute-0 nova_compute[186241]: 2025-11-25 06:21:30.642 186245 DEBUG nova.network.neutron [req-d4393deb-cbaf-470b-9b11-92b8f9db986d req-47e07777-8d93-4ff4-9aaa-9a4461ffd2c1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Updated VIF entry in instance network info cache for port 7704ac5e-d3f5-484e-b018-096af3d84408. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:21:30 compute-0 nova_compute[186241]: 2025-11-25 06:21:30.642 186245 DEBUG nova.network.neutron [req-d4393deb-cbaf-470b-9b11-92b8f9db986d req-47e07777-8d93-4ff4-9aaa-9a4461ffd2c1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Updating instance_info_cache with network_info: [{"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:21:31 compute-0 nova_compute[186241]: 2025-11-25 06:21:31.145 186245 DEBUG oslo_concurrency.lockutils [req-d4393deb-cbaf-470b-9b11-92b8f9db986d req-47e07777-8d93-4ff4-9aaa-9a4461ffd2c1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:21:31 compute-0 nova_compute[186241]: 2025-11-25 06:21:31.909 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:32 compute-0 nova_compute[186241]: 2025-11-25 06:21:32.710 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:33 compute-0 podman[213044]: 2025-11-25 06:21:33.057200649 +0000 UTC m=+0.039030839 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 25 06:21:33 compute-0 nova_compute[186241]: 2025-11-25 06:21:33.728 186245 INFO nova.compute.manager [None req-5cef6722-2ce9-44a3-8341-3ded0c224e8d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Get console output
Nov 25 06:21:33 compute-0 nova_compute[186241]: 2025-11-25 06:21:33.732 211770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 25 06:21:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:35.221 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:21:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:35.222 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:21:35 compute-0 nova_compute[186241]: 2025-11-25 06:21:35.222 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:36 compute-0 nova_compute[186241]: 2025-11-25 06:21:36.911 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:37 compute-0 nova_compute[186241]: 2025-11-25 06:21:37.712 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:38 compute-0 podman[213064]: 2025-11-25 06:21:38.095971914 +0000 UTC m=+0.071627179 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:21:38 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:38.224 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:38 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:38.481 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:b6:6a 10.100.0.17'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.17/28', 'neutron:device_id': 'ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b395525f-b7c1-4fad-a4fb-afb48a89a77b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=466844ed-e541-4d51-b995-b250272e90bc, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=65740de4-0640-45af-9a28-560905724021) old=Port_Binding(mac=['fa:16:3e:57:b6:6a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b395525f-b7c1-4fad-a4fb-afb48a89a77b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:21:38 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:38.482 103953 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 65740de4-0640-45af-9a28-560905724021 in datapath b395525f-b7c1-4fad-a4fb-afb48a89a77b updated
Nov 25 06:21:38 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:38.484 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b395525f-b7c1-4fad-a4fb-afb48a89a77b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:21:38 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:38.484 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[eeafdc35-3d4a-4388-b574-e22049c77724]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:39 compute-0 nova_compute[186241]: 2025-11-25 06:21:39.918 186245 DEBUG oslo_concurrency.lockutils [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "interface-394ce10b-bae7-43fa-b133-df28182f99db-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:39 compute-0 nova_compute[186241]: 2025-11-25 06:21:39.919 186245 DEBUG oslo_concurrency.lockutils [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "interface-394ce10b-bae7-43fa-b133-df28182f99db-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:39 compute-0 nova_compute[186241]: 2025-11-25 06:21:39.919 186245 DEBUG nova.objects.instance [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'flavor' on Instance uuid 394ce10b-bae7-43fa-b133-df28182f99db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:21:41 compute-0 nova_compute[186241]: 2025-11-25 06:21:41.449 186245 DEBUG nova.objects.instance [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_requests' on Instance uuid 394ce10b-bae7-43fa-b133-df28182f99db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:21:41 compute-0 nova_compute[186241]: 2025-11-25 06:21:41.913 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:41 compute-0 nova_compute[186241]: 2025-11-25 06:21:41.951 186245 DEBUG nova.objects.base [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Object Instance<394ce10b-bae7-43fa-b133-df28182f99db> lazy-loaded attributes: flavor,pci_requests wrapper /usr/lib/python3.9/site-packages/nova/objects/base.py:136
Nov 25 06:21:41 compute-0 nova_compute[186241]: 2025-11-25 06:21:41.952 186245 DEBUG nova.network.neutron [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:21:42 compute-0 nova_compute[186241]: 2025-11-25 06:21:42.713 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:43 compute-0 podman[213082]: 2025-11-25 06:21:43.055618384 +0000 UTC m=+0.032516361 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:21:43 compute-0 nova_compute[186241]: 2025-11-25 06:21:43.146 186245 DEBUG nova.policy [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:21:43 compute-0 nova_compute[186241]: 2025-11-25 06:21:43.772 186245 DEBUG nova.network.neutron [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Successfully created port: dc7318f2-544d-40fc-a3e1-24a837e45226 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:21:46 compute-0 nova_compute[186241]: 2025-11-25 06:21:46.193 186245 DEBUG nova.network.neutron [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Successfully updated port: dc7318f2-544d-40fc-a3e1-24a837e45226 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:21:46 compute-0 nova_compute[186241]: 2025-11-25 06:21:46.398 186245 DEBUG nova.compute.manager [req-2cdcfe8d-65d7-4872-9af5-3f45d458dec4 req-a5ea55be-c968-4379-b65d-1ae98fa74f65 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-changed-dc7318f2-544d-40fc-a3e1-24a837e45226 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:21:46 compute-0 nova_compute[186241]: 2025-11-25 06:21:46.399 186245 DEBUG nova.compute.manager [req-2cdcfe8d-65d7-4872-9af5-3f45d458dec4 req-a5ea55be-c968-4379-b65d-1ae98fa74f65 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Refreshing instance network info cache due to event network-changed-dc7318f2-544d-40fc-a3e1-24a837e45226. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:21:46 compute-0 nova_compute[186241]: 2025-11-25 06:21:46.399 186245 DEBUG oslo_concurrency.lockutils [req-2cdcfe8d-65d7-4872-9af5-3f45d458dec4 req-a5ea55be-c968-4379-b65d-1ae98fa74f65 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:21:46 compute-0 nova_compute[186241]: 2025-11-25 06:21:46.399 186245 DEBUG oslo_concurrency.lockutils [req-2cdcfe8d-65d7-4872-9af5-3f45d458dec4 req-a5ea55be-c968-4379-b65d-1ae98fa74f65 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:21:46 compute-0 nova_compute[186241]: 2025-11-25 06:21:46.399 186245 DEBUG nova.network.neutron [req-2cdcfe8d-65d7-4872-9af5-3f45d458dec4 req-a5ea55be-c968-4379-b65d-1ae98fa74f65 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Refreshing network info cache for port dc7318f2-544d-40fc-a3e1-24a837e45226 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:21:46 compute-0 nova_compute[186241]: 2025-11-25 06:21:46.696 186245 DEBUG oslo_concurrency.lockutils [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:21:46 compute-0 nova_compute[186241]: 2025-11-25 06:21:46.915 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:47.368 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:47.368 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:47.369 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:47 compute-0 nova_compute[186241]: 2025-11-25 06:21:47.715 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:51 compute-0 podman[213104]: 2025-11-25 06:21:51.073457211 +0000 UTC m=+0.052852182 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 06:21:51 compute-0 nova_compute[186241]: 2025-11-25 06:21:51.656 186245 DEBUG nova.network.neutron [req-2cdcfe8d-65d7-4872-9af5-3f45d458dec4 req-a5ea55be-c968-4379-b65d-1ae98fa74f65 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Added VIF to instance network info cache for port dc7318f2-544d-40fc-a3e1-24a837e45226. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3546
Nov 25 06:21:51 compute-0 nova_compute[186241]: 2025-11-25 06:21:51.656 186245 DEBUG nova.network.neutron [req-2cdcfe8d-65d7-4872-9af5-3f45d458dec4 req-a5ea55be-c968-4379-b65d-1ae98fa74f65 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Updating instance_info_cache with network_info: [{"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:21:51 compute-0 nova_compute[186241]: 2025-11-25 06:21:51.916 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:52 compute-0 nova_compute[186241]: 2025-11-25 06:21:52.159 186245 DEBUG oslo_concurrency.lockutils [req-2cdcfe8d-65d7-4872-9af5-3f45d458dec4 req-a5ea55be-c968-4379-b65d-1ae98fa74f65 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:21:52 compute-0 nova_compute[186241]: 2025-11-25 06:21:52.160 186245 DEBUG oslo_concurrency.lockutils [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:21:52 compute-0 nova_compute[186241]: 2025-11-25 06:21:52.160 186245 DEBUG nova.network.neutron [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:21:52 compute-0 nova_compute[186241]: 2025-11-25 06:21:52.717 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:53 compute-0 nova_compute[186241]: 2025-11-25 06:21:53.141 186245 WARNING nova.network.neutron [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] b395525f-b7c1-4fad-a4fb-afb48a89a77b already exists in list: networks containing: ['b395525f-b7c1-4fad-a4fb-afb48a89a77b']. ignoring it
Nov 25 06:21:53 compute-0 nova_compute[186241]: 2025-11-25 06:21:53.141 186245 WARNING nova.network.neutron [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] dc7318f2-544d-40fc-a3e1-24a837e45226 already exists in list: port_ids containing: ['dc7318f2-544d-40fc-a3e1-24a837e45226']. ignoring it
Nov 25 06:21:55 compute-0 podman[213127]: 2025-11-25 06:21:55.070950968 +0000 UTC m=+0.045664158 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 06:21:55 compute-0 podman[213128]: 2025-11-25 06:21:55.071971773 +0000 UTC m=+0.044552411 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:21:55 compute-0 nova_compute[186241]: 2025-11-25 06:21:55.636 186245 DEBUG nova.network.neutron [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Updating instance_info_cache with network_info: [{"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.139 186245 DEBUG oslo_concurrency.lockutils [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.141 186245 DEBUG nova.virt.libvirt.vif [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:20:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-352064862',display_name='tempest-TestNetworkBasicOps-server-352064862',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-352064862',id=3,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5eRUy74odxp9Q2Am4HiDIkMdvRYPpw1VUK3zfp+EbN2Ota/jKN8edSaGUzCIGEJamacDqcH0lJ6H/skO0Xvp6BAJvgTjvLUerS98Msbl+Qa+0/i1uo7EnhHPR93WCglQ==',key_name='tempest-TestNetworkBasicOps-1679774923',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:21:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-0mx851oz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:21:18Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=394ce10b-bae7-43fa-b133-df28182f99db,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.142 186245 DEBUG nova.network.os_vif_util [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.142 186245 DEBUG nova.network.os_vif_util [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:56:99,bridge_name='br-int',has_traffic_filtering=True,id=dc7318f2-544d-40fc-a3e1-24a837e45226,network=Network(b395525f-b7c1-4fad-a4fb-afb48a89a77b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc7318f2-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.143 186245 DEBUG os_vif [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:56:99,bridge_name='br-int',has_traffic_filtering=True,id=dc7318f2-544d-40fc-a3e1-24a837e45226,network=Network(b395525f-b7c1-4fad-a4fb-afb48a89a77b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc7318f2-54') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.143 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.143 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.144 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.144 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.144 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1826dbc5-becd-540e-b418-0aad72f8996b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.145 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.147 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.148 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.148 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc7318f2-54, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.149 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapdc7318f2-54, col_values=(('qos', UUID('103bb58b-db71-453e-83bb-d656f76a62d6')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.149 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapdc7318f2-54, col_values=(('external_ids', {'iface-id': 'dc7318f2-544d-40fc-a3e1-24a837e45226', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a5:56:99', 'vm-uuid': '394ce10b-bae7-43fa-b133-df28182f99db'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.150 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 NetworkManager[55345]: <info>  [1764051716.1510] manager: (tapdc7318f2-54): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.152 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.155 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.156 186245 INFO os_vif [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:56:99,bridge_name='br-int',has_traffic_filtering=True,id=dc7318f2-544d-40fc-a3e1-24a837e45226,network=Network(b395525f-b7c1-4fad-a4fb-afb48a89a77b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc7318f2-54')
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.157 186245 DEBUG nova.virt.libvirt.vif [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:20:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-352064862',display_name='tempest-TestNetworkBasicOps-server-352064862',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-352064862',id=3,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5eRUy74odxp9Q2Am4HiDIkMdvRYPpw1VUK3zfp+EbN2Ota/jKN8edSaGUzCIGEJamacDqcH0lJ6H/skO0Xvp6BAJvgTjvLUerS98Msbl+Qa+0/i1uo7EnhHPR93WCglQ==',key_name='tempest-TestNetworkBasicOps-1679774923',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:21:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-0mx851oz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:21:18Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=394ce10b-bae7-43fa-b133-df28182f99db,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.157 186245 DEBUG nova.network.os_vif_util [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.158 186245 DEBUG nova.network.os_vif_util [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:56:99,bridge_name='br-int',has_traffic_filtering=True,id=dc7318f2-544d-40fc-a3e1-24a837e45226,network=Network(b395525f-b7c1-4fad-a4fb-afb48a89a77b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc7318f2-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.160 186245 DEBUG nova.virt.libvirt.guest [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] attach device xml: <interface type="ethernet">
Nov 25 06:21:56 compute-0 nova_compute[186241]:   <mac address="fa:16:3e:a5:56:99"/>
Nov 25 06:21:56 compute-0 nova_compute[186241]:   <model type="virtio"/>
Nov 25 06:21:56 compute-0 nova_compute[186241]:   <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:21:56 compute-0 nova_compute[186241]:   <mtu size="1442"/>
Nov 25 06:21:56 compute-0 nova_compute[186241]:   <target dev="tapdc7318f2-54"/>
Nov 25 06:21:56 compute-0 nova_compute[186241]: </interface>
Nov 25 06:21:56 compute-0 nova_compute[186241]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:336
Nov 25 06:21:56 compute-0 kernel: tapdc7318f2-54: entered promiscuous mode
Nov 25 06:21:56 compute-0 NetworkManager[55345]: <info>  [1764051716.1693] manager: (tapdc7318f2-54): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.171 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 ovn_controller[95135]: 2025-11-25T06:21:56Z|00066|binding|INFO|Claiming lport dc7318f2-544d-40fc-a3e1-24a837e45226 for this chassis.
Nov 25 06:21:56 compute-0 ovn_controller[95135]: 2025-11-25T06:21:56Z|00067|binding|INFO|dc7318f2-544d-40fc-a3e1-24a837e45226: Claiming fa:16:3e:a5:56:99 10.100.0.27
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.175 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.178 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:56:99 10.100.0.27'], port_security=['fa:16:3e:a5:56:99 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '394ce10b-bae7-43fa-b133-df28182f99db', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b395525f-b7c1-4fad-a4fb-afb48a89a77b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dafbde12-3514-4e2d-980f-9529576187d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=466844ed-e541-4d51-b995-b250272e90bc, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=dc7318f2-544d-40fc-a3e1-24a837e45226) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.179 103953 INFO neutron.agent.ovn.metadata.agent [-] Port dc7318f2-544d-40fc-a3e1-24a837e45226 in datapath b395525f-b7c1-4fad-a4fb-afb48a89a77b bound to our chassis
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.180 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b395525f-b7c1-4fad-a4fb-afb48a89a77b
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.188 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[d265c91d-7469-4076-af46-2ce10d3a6a3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.188 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb395525f-b1 in ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.190 211354 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb395525f-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.190 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ed60f202-5c6c-4024-90e5-8438a4716bc1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.190 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[60ef1c91-c7d6-4d0d-8b6c-1b774165f439]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 systemd-udevd[213173]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.199 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[833458a7-3313-41fd-a9fc-3197ddd2b517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 NetworkManager[55345]: <info>  [1764051716.2080] device (tapdc7318f2-54): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:21:56 compute-0 NetworkManager[55345]: <info>  [1764051716.2088] device (tapdc7318f2-54): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:21:56 compute-0 ovn_controller[95135]: 2025-11-25T06:21:56Z|00068|binding|INFO|Setting lport dc7318f2-544d-40fc-a3e1-24a837e45226 ovn-installed in OVS
Nov 25 06:21:56 compute-0 ovn_controller[95135]: 2025-11-25T06:21:56Z|00069|binding|INFO|Setting lport dc7318f2-544d-40fc-a3e1-24a837e45226 up in Southbound
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.214 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.222 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5929538f-9a55-4787-91e1-6ac45bc33ec7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.246 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[03e0cc40-9ec4-44c0-9d66-aa40f31c8ab3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 NetworkManager[55345]: <info>  [1764051716.2514] manager: (tapb395525f-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.252 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7f336827-9ee9-44d3-b788-a66c5d38671a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 systemd-udevd[213176]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.274 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[d3bbaf94-71ce-4346-bf5c-5cba6b6ea389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.276 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[f0affea2-9947-4fb1-a224-a55722ccc2f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 NetworkManager[55345]: <info>  [1764051716.2908] device (tapb395525f-b0): carrier: link connected
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.295 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[a1723769-0d42-4522-b780-a4b2bb475ebf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.310 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[74d2eb1a-53d4-4436-8f07-7bb6d4ca28a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb395525f-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:b6:6a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 270469, 'reachable_time': 37704, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213192, 'error': None, 'target': 'ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.322 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5f45dc80-e08c-49f5-aef4-fb5112d0a88f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe57:b66a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 270469, 'tstamp': 270469}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213193, 'error': None, 'target': 'ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.334 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[dcbbe348-8edd-4e72-ba3e-448d114b7457]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb395525f-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:b6:6a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 270469, 'reachable_time': 37704, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213194, 'error': None, 'target': 'ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.356 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5abca91a-2eeb-4059-8c0d-ac9c9635ed25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.397 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[96f04c7e-a834-4d4a-982f-0c9c6119fe87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.398 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb395525f-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.398 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.398 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb395525f-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:56 compute-0 NetworkManager[55345]: <info>  [1764051716.4006] manager: (tapb395525f-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.400 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 kernel: tapb395525f-b0: entered promiscuous mode
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.402 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.403 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb395525f-b0, col_values=(('external_ids', {'iface-id': '65740de4-0640-45af-9a28-560905724021'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.404 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.405 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 ovn_controller[95135]: 2025-11-25T06:21:56Z|00070|binding|INFO|Releasing lport 65740de4-0640-45af-9a28-560905724021 from this chassis (sb_readonly=0)
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.406 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ca290847-464a-4631-bb15-16578806e302]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.406 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b395525f-b7c1-4fad-a4fb-afb48a89a77b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b395525f-b7c1-4fad-a4fb-afb48a89a77b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.406 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b395525f-b7c1-4fad-a4fb-afb48a89a77b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b395525f-b7c1-4fad-a4fb-afb48a89a77b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.406 103953 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for b395525f-b7c1-4fad-a4fb-afb48a89a77b disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.407 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b395525f-b7c1-4fad-a4fb-afb48a89a77b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b395525f-b7c1-4fad-a4fb-afb48a89a77b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.407 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[beb0a759-021f-4a75-ab25-0da13b90737d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.407 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b395525f-b7c1-4fad-a4fb-afb48a89a77b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b395525f-b7c1-4fad-a4fb-afb48a89a77b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.407 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b8789b-8891-4594-a969-7486fc8f59a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.408 103953 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: global
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     log         /dev/log local0 debug
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     log-tag     haproxy-metadata-proxy-b395525f-b7c1-4fad-a4fb-afb48a89a77b
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     user        root
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     group       root
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     maxconn     1024
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     pidfile     /var/lib/neutron/external/pids/b395525f-b7c1-4fad-a4fb-afb48a89a77b.pid.haproxy
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     daemon
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: defaults
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     log global
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     mode http
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     option httplog
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     option dontlognull
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     option http-server-close
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     option forwardfor
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     retries                 3
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     timeout http-request    30s
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     timeout connect         30s
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     timeout client          32s
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     timeout server          32s
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     timeout http-keep-alive 30s
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: listen listener
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     bind 169.254.169.254:80
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:     http-request add-header X-OVN-Network-ID b395525f-b7c1-4fad-a4fb-afb48a89a77b
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 06:21:56 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:21:56.408 103953 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b', 'env', 'PROCESS_TAG=haproxy-b395525f-b7c1-4fad-a4fb-afb48a89a77b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b395525f-b7c1-4fad-a4fb-afb48a89a77b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.417 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:56 compute-0 podman[213222]: 2025-11-25 06:21:56.695108319 +0000 UTC m=+0.039127170 container create 5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:21:56 compute-0 systemd[1]: Started libpod-conmon-5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428.scope.
Nov 25 06:21:56 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:21:56 compute-0 podman[213222]: 2025-11-25 06:21:56.674127662 +0000 UTC m=+0.018146533 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:21:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da198403cdc73db998354b51bb5a0347f8775ba758d24f235891fa4d1aaa5e45/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:21:56 compute-0 podman[213222]: 2025-11-25 06:21:56.803154169 +0000 UTC m=+0.147173030 container init 5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 06:21:56 compute-0 podman[213222]: 2025-11-25 06:21:56.807282365 +0000 UTC m=+0.151301215 container start 5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:21:56 compute-0 neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b[213234]: [NOTICE]   (213238) : New worker (213240) forked
Nov 25 06:21:56 compute-0 neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b[213234]: [NOTICE]   (213238) : Loading success.
Nov 25 06:21:56 compute-0 nova_compute[186241]: 2025-11-25 06:21:56.917 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:21:57 compute-0 nova_compute[186241]: 2025-11-25 06:21:57.336 186245 DEBUG nova.compute.manager [req-0675d3ca-e5eb-4e00-bc6e-8463c40f9032 req-90f94d36-c93f-44bf-b232-0cec9c2e9845 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-vif-plugged-dc7318f2-544d-40fc-a3e1-24a837e45226 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:21:57 compute-0 nova_compute[186241]: 2025-11-25 06:21:57.337 186245 DEBUG oslo_concurrency.lockutils [req-0675d3ca-e5eb-4e00-bc6e-8463c40f9032 req-90f94d36-c93f-44bf-b232-0cec9c2e9845 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "394ce10b-bae7-43fa-b133-df28182f99db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:57 compute-0 nova_compute[186241]: 2025-11-25 06:21:57.337 186245 DEBUG oslo_concurrency.lockutils [req-0675d3ca-e5eb-4e00-bc6e-8463c40f9032 req-90f94d36-c93f-44bf-b232-0cec9c2e9845 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:57 compute-0 nova_compute[186241]: 2025-11-25 06:21:57.337 186245 DEBUG oslo_concurrency.lockutils [req-0675d3ca-e5eb-4e00-bc6e-8463c40f9032 req-90f94d36-c93f-44bf-b232-0cec9c2e9845 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:57 compute-0 nova_compute[186241]: 2025-11-25 06:21:57.337 186245 DEBUG nova.compute.manager [req-0675d3ca-e5eb-4e00-bc6e-8463c40f9032 req-90f94d36-c93f-44bf-b232-0cec9c2e9845 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] No waiting events found dispatching network-vif-plugged-dc7318f2-544d-40fc-a3e1-24a837e45226 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:21:57 compute-0 nova_compute[186241]: 2025-11-25 06:21:57.338 186245 WARNING nova.compute.manager [req-0675d3ca-e5eb-4e00-bc6e-8463c40f9032 req-90f94d36-c93f-44bf-b232-0cec9c2e9845 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received unexpected event network-vif-plugged-dc7318f2-544d-40fc-a3e1-24a837e45226 for instance with vm_state active and task_state None.
Nov 25 06:21:57 compute-0 nova_compute[186241]: 2025-11-25 06:21:57.733 186245 DEBUG nova.virt.libvirt.driver [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:21:57 compute-0 nova_compute[186241]: 2025-11-25 06:21:57.733 186245 DEBUG nova.virt.libvirt.driver [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:21:57 compute-0 nova_compute[186241]: 2025-11-25 06:21:57.734 186245 DEBUG nova.virt.libvirt.driver [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:21:28:be, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:21:57 compute-0 nova_compute[186241]: 2025-11-25 06:21:57.734 186245 DEBUG nova.virt.libvirt.driver [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:a5:56:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:21:58 compute-0 nova_compute[186241]: 2025-11-25 06:21:58.239 186245 DEBUG nova.virt.driver [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-352064862', uuid='394ce10b-bae7-43fa-b133-df28182f99db'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus='sata',hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus='virtio',hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus='usb',hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type='q35',hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model='usbtablet',hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model='virtio',hw_video_ram=<?>,hw_vif_model='virtio',hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bitt
orrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", 
"bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764051718.2389426) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:21:58 compute-0 nova_compute[186241]: 2025-11-25 06:21:58.240 186245 DEBUG nova.virt.libvirt.guest [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:21:58 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:21:58 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-352064862</nova:name>
Nov 25 06:21:58 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:21:58</nova:creationTime>
Nov 25 06:21:58 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:21:58 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:21:58 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:21:58 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:21:58 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:21:58 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:21:58 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:21:58 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:21:58 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:21:58 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:21:58 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:21:58 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:21:58 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:21:58 compute-0 nova_compute[186241]:     <nova:port uuid="7704ac5e-d3f5-484e-b018-096af3d84408">
Nov 25 06:21:58 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 25 06:21:58 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:21:58 compute-0 nova_compute[186241]:     <nova:port uuid="dc7318f2-544d-40fc-a3e1-24a837e45226">
Nov 25 06:21:58 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Nov 25 06:21:58 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:21:58 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:21:58 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:21:58 compute-0 nova_compute[186241]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:356
Nov 25 06:21:58 compute-0 nova_compute[186241]: 2025-11-25 06:21:58.747 186245 DEBUG oslo_concurrency.lockutils [None req-e8755c6d-3c81-4856-ad99-dd5145d5c002 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "interface-394ce10b-bae7-43fa-b133-df28182f99db-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 18.829s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:59 compute-0 ovn_controller[95135]: 2025-11-25T06:21:59Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a5:56:99 10.100.0.27
Nov 25 06:21:59 compute-0 ovn_controller[95135]: 2025-11-25T06:21:59Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a5:56:99 10.100.0.27
Nov 25 06:21:59 compute-0 podman[213247]: 2025-11-25 06:21:59.065030927 +0000 UTC m=+0.039673180 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Nov 25 06:21:59 compute-0 nova_compute[186241]: 2025-11-25 06:21:59.476 186245 DEBUG nova.compute.manager [req-0d4d9ecc-2ad9-4812-ba07-6993ad943ac4 req-54346f2d-8b7b-420c-99ff-d33a434effae a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-vif-plugged-dc7318f2-544d-40fc-a3e1-24a837e45226 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:21:59 compute-0 nova_compute[186241]: 2025-11-25 06:21:59.476 186245 DEBUG oslo_concurrency.lockutils [req-0d4d9ecc-2ad9-4812-ba07-6993ad943ac4 req-54346f2d-8b7b-420c-99ff-d33a434effae a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "394ce10b-bae7-43fa-b133-df28182f99db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:21:59 compute-0 nova_compute[186241]: 2025-11-25 06:21:59.477 186245 DEBUG oslo_concurrency.lockutils [req-0d4d9ecc-2ad9-4812-ba07-6993ad943ac4 req-54346f2d-8b7b-420c-99ff-d33a434effae a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:21:59 compute-0 nova_compute[186241]: 2025-11-25 06:21:59.477 186245 DEBUG oslo_concurrency.lockutils [req-0d4d9ecc-2ad9-4812-ba07-6993ad943ac4 req-54346f2d-8b7b-420c-99ff-d33a434effae a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:21:59 compute-0 nova_compute[186241]: 2025-11-25 06:21:59.477 186245 DEBUG nova.compute.manager [req-0d4d9ecc-2ad9-4812-ba07-6993ad943ac4 req-54346f2d-8b7b-420c-99ff-d33a434effae a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] No waiting events found dispatching network-vif-plugged-dc7318f2-544d-40fc-a3e1-24a837e45226 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:21:59 compute-0 nova_compute[186241]: 2025-11-25 06:21:59.477 186245 WARNING nova.compute.manager [req-0d4d9ecc-2ad9-4812-ba07-6993ad943ac4 req-54346f2d-8b7b-420c-99ff-d33a434effae a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received unexpected event network-vif-plugged-dc7318f2-544d-40fc-a3e1-24a837e45226 for instance with vm_state active and task_state None.
Nov 25 06:21:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:21:59.549 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:21:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:21:59.552 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/394ce10b-bae7-43fa-b133-df28182f99db -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}e471cc3fc7ae9ac5d8fd794e8aefa20e5f5c77c3e3edccb41964d2d46a7818d3" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.415 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 2148 Content-Type: application/json Date: Tue, 25 Nov 2025 06:21:59 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-3e448e6a-3320-494b-94be-01b1965e59dc x-openstack-request-id: req-3e448e6a-3320-494b-94be-01b1965e59dc _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.415 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "394ce10b-bae7-43fa-b133-df28182f99db", "name": "tempest-TestNetworkBasicOps-server-352064862", "status": "ACTIVE", "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "user_id": "66a05d0ca82146a5a458244c8e5364de", "metadata": {}, "hostId": "d6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5", "image": {"id": "5215c26e-be2f-40b4-ac47-476bfa3cf3f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/5215c26e-be2f-40b4-ac47-476bfa3cf3f2"}]}, "flavor": {"id": "53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac"}]}, "created": "2025-11-25T06:20:58Z", "updated": "2025-11-25T06:21:18Z", "addresses": {"tempest-network-smoke--1925175189": [{"version": 4, "addr": "10.100.0.8", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:21:28:be"}, {"version": 4, "addr": "192.168.122.215", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:21:28:be"}], "tempest-network-smoke--105654638": [{"version": 4, "addr": "10.100.0.27", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:a5:56:99"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/394ce10b-bae7-43fa-b133-df28182f99db"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/394ce10b-bae7-43fa-b133-df28182f99db"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-1679774923", "OS-SRV-USG:launched_at": "2025-11-25T06:21:18.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-secgroup-smoke-852580020"}, {"name": "default"}], "OS-EXT-SRV-ATTR:host": 
"compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000003", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.415 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/394ce10b-bae7-43fa-b133-df28182f99db used request id req-3e448e6a-3320-494b-94be-01b1965e59dc request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.416 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '394ce10b-bae7-43fa-b133-df28182f99db', 'name': 'tempest-TestNetworkBasicOps-server-352064862', 'flavor': {'id': '53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd90b557db9104ecfb816b1cdab8712bd', 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'hostId': 'd6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.416 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.416 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b23a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.416 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b23a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.416 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.417 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2025-11-25T06:22:00.416792) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.436 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.read.requests volume: 1075 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.436 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.read.requests volume: 113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.437 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.437 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.437 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.437 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4100>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.437 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4100>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.437 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.437 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2025-11-25T06:22:00.437668) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.439 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 394ce10b-bae7-43fa-b133-df28182f99db / tap7704ac5e-d3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.440 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 394ce10b-bae7-43fa-b133-df28182f99db / tapdc7318f2-54 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.440 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.440 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.440 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.440 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.440 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.441 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2310>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.441 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2310>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.441 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.441 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.write.latency volume: 346642577 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.441 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.441 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.441 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.441 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.442 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2f40>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.442 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2f40>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.442 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.442 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.write.requests volume: 328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.442 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.442 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.442 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.443 16 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.443 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800afdc0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.443 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800afdc0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.443 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: cpu heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.443 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2025-11-25T06:22:00.441143) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.443 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2025-11-25T06:22:00.442220) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.443 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2025-11-25T06:22:00.443277) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.454 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/cpu volume: 10020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.455 16 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.455 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.455 16 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.455 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2a60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.455 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2a60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.455 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.455 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2025-11-25T06:22:00.455658) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.456 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.456 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.456 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.456 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2520>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.456 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2520>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.456 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.456 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2025-11-25T06:22:00.456427) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.456 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.read.bytes volume: 29997568 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.456 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.read.bytes volume: 284990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.457 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.457 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.457 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.457 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4700>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.457 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4700>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.457 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.457 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets (2025-11-25T06:22:00.457480) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.457 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.outgoing.packets volume: 74 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.457 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.458 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.458 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.458 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.458 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4bb0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.458 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4bb0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.458 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.458 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2025-11-25T06:22:00.458555) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.458 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.458 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-352064862>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-352064862>]
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.458 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.459 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.459 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4070>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.459 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4070>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.459 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.459 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2025-11-25T06:22:00.459287) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.459 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.outgoing.bytes volume: 12178 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.459 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.459 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.460 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.460 16 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.460 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4310>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.460 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4310>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.460 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: memory.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.460 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/memory.usage volume: 42.9140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.460 16 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.460 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.460 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.461 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2025-11-25T06:22:00.460354) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.461 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2d00>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.461 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2d00>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.461 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.461 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2025-11-25T06:22:00.461239) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.461 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.write.bytes volume: 72990720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.461 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.461 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.461 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.462 16 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.462 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2b20>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.462 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2b20>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.462 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.root.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.462 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2025-11-25T06:22:00.462280) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.462 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.462 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.462 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.462 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4910>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.462 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4910>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.463 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.463 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.463 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.463 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2025-11-25T06:22:00.463014) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.463 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.463 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.463 16 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.463 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800ca460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.464 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800ca460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.464 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: power.state heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.464 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2025-11-25T06:22:00.464104) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.464 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.464 16 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.464 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.464 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.464 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4af0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.464 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4af0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.464 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.465 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2025-11-25T06:22:00.464920) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.465 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.465 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.465 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.465 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.465 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.465 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4490>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.465 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4490>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.465 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.466 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2025-11-25T06:22:00.465965) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.466 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.466 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.466 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.466 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.466 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.466 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b25e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.466 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b25e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.467 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.467 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2025-11-25T06:22:00.467011) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.467 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.read.latency volume: 204839210 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.467 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.read.latency volume: 90992395 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.467 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.467 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.467 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.467 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4760>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.467 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4760>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.468 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.468 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2025-11-25T06:22:00.468019) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.468 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.468 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-352064862>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-352064862>]
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.468 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.468 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.468 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b28e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.468 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b28e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.468 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.468 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2025-11-25T06:22:00.468706) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.475 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.475 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.476 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.476 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.476 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.476 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c44f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.476 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c44f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.476 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.476 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2025-11-25T06:22:00.476512) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.476 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.incoming.bytes volume: 15180 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.476 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.incoming.bytes volume: 1330 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.477 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.477 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.477 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.477 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4d30>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.477 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4d30>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.477 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.477 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.477 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2025-11-25T06:22:00.477596) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.478 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.478 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.478 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.478 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.478 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.478 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.478 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.478 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2025-11-25T06:22:00.478711) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.478 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.479 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.479 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.479 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.479 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.479 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c42b0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.479 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c42b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.479 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.479 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.480 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.480 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.480 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.480 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.480 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2025-11-25T06:22:00.479758) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.480 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2970>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.480 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2970>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.480 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.481 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2025-11-25T06:22:00.480879) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.481 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.481 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.481 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.481 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.481 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.481 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4c70>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.481 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4c70>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.481 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.482 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.incoming.packets volume: 75 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.482 16 DEBUG ceilometer.compute.pollsters [-] 394ce10b-bae7-43fa-b133-df28182f99db/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.482 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Nov 25 06:22:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:22:00.482 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2025-11-25T06:22:00.481917) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:22:01 compute-0 nova_compute[186241]: 2025-11-25 06:22:01.151 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:01 compute-0 nova_compute[186241]: 2025-11-25 06:22:01.920 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:02 compute-0 nova_compute[186241]: 2025-11-25 06:22:02.045 186245 DEBUG oslo_concurrency.lockutils [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "interface-394ce10b-bae7-43fa-b133-df28182f99db-dc7318f2-544d-40fc-a3e1-24a837e45226" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:22:02 compute-0 nova_compute[186241]: 2025-11-25 06:22:02.045 186245 DEBUG oslo_concurrency.lockutils [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "interface-394ce10b-bae7-43fa-b133-df28182f99db-dc7318f2-544d-40fc-a3e1-24a837e45226" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:22:02 compute-0 nova_compute[186241]: 2025-11-25 06:22:02.550 186245 DEBUG nova.objects.instance [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'flavor' on Instance uuid 394ce10b-bae7-43fa-b133-df28182f99db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.054 186245 DEBUG nova.virt.libvirt.vif [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:20:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-352064862',display_name='tempest-TestNetworkBasicOps-server-352064862',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-352064862',id=3,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5eRUy74odxp9Q2Am4HiDIkMdvRYPpw1VUK3zfp+EbN2Ota/jKN8edSaGUzCIGEJamacDqcH0lJ6H/skO0Xvp6BAJvgTjvLUerS98Msbl+Qa+0/i1uo7EnhHPR93WCglQ==',key_name='tempest-TestNetworkBasicOps-1679774923',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:21:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-0mx851oz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:21:18Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=394ce10b-bae7-43fa-b133-df28182f99db,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.054 186245 DEBUG nova.network.os_vif_util [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.055 186245 DEBUG nova.network.os_vif_util [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:56:99,bridge_name='br-int',has_traffic_filtering=True,id=dc7318f2-544d-40fc-a3e1-24a837e45226,network=Network(b395525f-b7c1-4fad-a4fb-afb48a89a77b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc7318f2-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.057 186245 DEBUG nova.virt.libvirt.guest [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a5:56:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdc7318f2-54"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.058 186245 DEBUG nova.virt.libvirt.guest [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a5:56:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdc7318f2-54"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.060 186245 DEBUG nova.virt.libvirt.driver [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Attempting to detach device tapdc7318f2-54 from instance 394ce10b-bae7-43fa-b133-df28182f99db from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2637
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.060 186245 DEBUG nova.virt.libvirt.guest [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] detach device xml: <interface type="ethernet">
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <mac address="fa:16:3e:a5:56:99"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <model type="virtio"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <mtu size="1442"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <target dev="tapdc7318f2-54"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]: </interface>
Nov 25 06:22:03 compute-0 nova_compute[186241]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:466
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.064 186245 DEBUG nova.virt.libvirt.guest [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a5:56:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdc7318f2-54"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.066 186245 DEBUG nova.virt.libvirt.guest [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a5:56:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdc7318f2-54"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <name>instance-00000003</name>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <uuid>394ce10b-bae7-43fa-b133-df28182f99db</uuid>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-352064862</nova:name>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:21:58</nova:creationTime>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <nova:port uuid="7704ac5e-d3f5-484e-b018-096af3d84408">
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <nova:port uuid="dc7318f2-544d-40fc-a3e1-24a837e45226">
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:22:03 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <memory unit='KiB'>131072</memory>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <vcpu placement='static'>1</vcpu>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <resource>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <partition>/machine</partition>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </resource>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <sysinfo type='smbios'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <system>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <entry name='manufacturer'>RDO</entry>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <entry name='product'>OpenStack Compute</entry>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <entry name='serial'>394ce10b-bae7-43fa-b133-df28182f99db</entry>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <entry name='uuid'>394ce10b-bae7-43fa-b133-df28182f99db</entry>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <entry name='family'>Virtual Machine</entry>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </system>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <os>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <boot dev='hd'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <smbios mode='sysinfo'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </os>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <features>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <vmcoreinfo state='on'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </features>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <cpu mode='custom' match='exact' check='full'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <model fallback='forbid'>EPYC-Milan</model>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <vendor>AMD</vendor>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='x2apic'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc-deadline'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='hypervisor'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc_adjust'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='vaes'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='vpclmulqdq'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='spec-ctrl'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='stibp'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='ssbd'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='cmp_legacy'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='overflow-recov'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='succor'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='virt-ssbd'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='disable' name='lbrv'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='disable' name='tsc-scale'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='disable' name='vmcb-clean'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='disable' name='flushbyasid'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='disable' name='pause-filter'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='disable' name='pfthreshold'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='disable' name='v-vmsave-vmload'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='disable' name='vgif'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='disable' name='svm'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='require' name='topoext'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='disable' name='npt'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='disable' name='nrip-save'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <clock offset='utc'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <timer name='pit' tickpolicy='delay'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <timer name='hpet' present='no'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <on_poweroff>destroy</on_poweroff>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <on_reboot>restart</on_reboot>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <on_crash>destroy</on_crash>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <disk type='file' device='disk'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk' index='2'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <backingStore type='file' index='3'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:         <format type='raw'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:         <source file='/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:         <backingStore/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       </backingStore>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target dev='vda' bus='virtio'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='virtio-disk0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <disk type='file' device='cdrom'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <driver name='qemu' type='raw' cache='none'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk.config' index='1'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <backingStore/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target dev='sda' bus='sata'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <readonly/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='sata0-0-0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='0' model='pcie-root'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pcie.0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='1' port='0x10'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.1'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='2' port='0x11'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.2'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='3' port='0x12'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.3'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='4' port='0x13'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.4'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='5' port='0x14'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.5'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='6' port='0x15'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.6'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='7' port='0x16'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.7'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='8' port='0x17'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.8'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='9' port='0x18'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.9'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='10' port='0x19'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.10'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='11' port='0x1a'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.11'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='12' port='0x1b'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.12'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='13' port='0x1c'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.13'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='14' port='0x1d'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.14'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='15' port='0x1e'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.15'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='16' port='0x1f'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.16'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='17' port='0x20'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.17'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='18' port='0x21'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.18'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='19' port='0x22'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.19'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='20' port='0x23'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.20'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='21' port='0x24'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.21'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='22' port='0x25'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.22'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='23' port='0x26'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.23'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='24' port='0x27'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.24'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target chassis='25' port='0x28'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.25'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model name='pcie-pci-bridge'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='pci.26'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='usb'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <controller type='sata' index='0'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='ide'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <interface type='ethernet'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <mac address='fa:16:3e:21:28:be'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target dev='tap7704ac5e-d3'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model type='virtio'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <driver name='vhost' rx_queue_size='512'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <mtu size='1442'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='net0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <interface type='ethernet'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <mac address='fa:16:3e:a5:56:99'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target dev='tapdc7318f2-54'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model type='virtio'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <driver name='vhost' rx_queue_size='512'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <mtu size='1442'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='net1'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <serial type='pty'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/console.log' append='off'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target type='isa-serial' port='0'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:         <model name='isa-serial'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       </target>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <console type='pty' tty='/dev/pts/0'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/console.log' append='off'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <target type='serial' port='0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </console>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <input type='tablet' bus='usb'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='input0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='usb' bus='0' port='1'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </input>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <input type='mouse' bus='ps2'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='input1'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </input>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <input type='keyboard' bus='ps2'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='input2'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </input>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <listen type='address' address='::0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </graphics>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <audio id='1' type='none'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <video>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <model type='virtio' heads='1' primary='yes'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='video0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </video>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <watchdog model='itco' action='reset'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='watchdog0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </watchdog>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <memballoon model='virtio'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <stats period='10'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='balloon0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <rng model='virtio'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <backend model='random'>/dev/urandom</backend>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <alias name='rng0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <label>system_u:system_r:svirt_t:s0:c326,c359</label>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c326,c359</imagelabel>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <label>+107:+107</label>
Nov 25 06:22:03 compute-0 nova_compute[186241]:     <imagelabel>+107:+107</imagelabel>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:22:03 compute-0 nova_compute[186241]: </domain>
Nov 25 06:22:03 compute-0 nova_compute[186241]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.067 186245 INFO nova.virt.libvirt.driver [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully detached device tapdc7318f2-54 from instance 394ce10b-bae7-43fa-b133-df28182f99db from the persistent domain config.
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.067 186245 DEBUG nova.virt.libvirt.driver [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] (1/8): Attempting to detach device tapdc7318f2-54 with device alias net1 from instance 394ce10b-bae7-43fa-b133-df28182f99db from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2673
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.068 186245 DEBUG nova.virt.libvirt.guest [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] detach device xml: <interface type="ethernet">
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <mac address="fa:16:3e:a5:56:99"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <model type="virtio"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <mtu size="1442"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]:   <target dev="tapdc7318f2-54"/>
Nov 25 06:22:03 compute-0 nova_compute[186241]: </interface>
Nov 25 06:22:03 compute-0 nova_compute[186241]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:466
Nov 25 06:22:03 compute-0 kernel: tapdc7318f2-54 (unregistering): left promiscuous mode
Nov 25 06:22:03 compute-0 NetworkManager[55345]: <info>  [1764051723.1165] device (tapdc7318f2-54): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:22:03 compute-0 ovn_controller[95135]: 2025-11-25T06:22:03Z|00071|binding|INFO|Releasing lport dc7318f2-544d-40fc-a3e1-24a837e45226 from this chassis (sb_readonly=0)
Nov 25 06:22:03 compute-0 ovn_controller[95135]: 2025-11-25T06:22:03Z|00072|binding|INFO|Setting lport dc7318f2-544d-40fc-a3e1-24a837e45226 down in Southbound
Nov 25 06:22:03 compute-0 ovn_controller[95135]: 2025-11-25T06:22:03Z|00073|binding|INFO|Removing iface tapdc7318f2-54 ovn-installed in OVS
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.118 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.119 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.123 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:56:99 10.100.0.27'], port_security=['fa:16:3e:a5:56:99 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '394ce10b-bae7-43fa-b133-df28182f99db', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b395525f-b7c1-4fad-a4fb-afb48a89a77b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'dafbde12-3514-4e2d-980f-9529576187d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=466844ed-e541-4d51-b995-b250272e90bc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=dc7318f2-544d-40fc-a3e1-24a837e45226) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.125 103953 INFO neutron.agent.ovn.metadata.agent [-] Port dc7318f2-544d-40fc-a3e1-24a837e45226 in datapath b395525f-b7c1-4fad-a4fb-afb48a89a77b unbound from our chassis
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.126 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b395525f-b7c1-4fad-a4fb-afb48a89a77b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.130 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb29345-2936-4183-8cbd-3d7062dbec95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.130 103953 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b namespace which is not needed anymore
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.136 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.140 186245 DEBUG nova.virt.libvirt.driver [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Start waiting for the detach event from libvirt for device tapdc7318f2-54 with device alias net1 for instance 394ce10b-bae7-43fa-b133-df28182f99db _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2749
Nov 25 06:22:03 compute-0 podman[213265]: 2025-11-25 06:22:03.205926822 +0000 UTC m=+0.076489580 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 06:22:03 compute-0 neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b[213234]: [NOTICE]   (213238) : haproxy version is 2.8.14-c23fe91
Nov 25 06:22:03 compute-0 neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b[213234]: [NOTICE]   (213238) : path to executable is /usr/sbin/haproxy
Nov 25 06:22:03 compute-0 neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b[213234]: [WARNING]  (213238) : Exiting Master process...
Nov 25 06:22:03 compute-0 podman[213302]: 2025-11-25 06:22:03.222816132 +0000 UTC m=+0.025299242 container kill 5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 06:22:03 compute-0 neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b[213234]: [ALERT]    (213238) : Current worker (213240) exited with code 143 (Terminated)
Nov 25 06:22:03 compute-0 neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b[213234]: [WARNING]  (213238) : All workers exited. Exiting... (0)
Nov 25 06:22:03 compute-0 systemd[1]: libpod-5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428.scope: Deactivated successfully.
Nov 25 06:22:03 compute-0 conmon[213234]: conmon 5689ca87897436de8c1f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428.scope/container/memory.events
Nov 25 06:22:03 compute-0 podman[213315]: 2025-11-25 06:22:03.257991759 +0000 UTC m=+0.022021942 container died 5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 25 06:22:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428-userdata-shm.mount: Deactivated successfully.
Nov 25 06:22:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-da198403cdc73db998354b51bb5a0347f8775ba758d24f235891fa4d1aaa5e45-merged.mount: Deactivated successfully.
Nov 25 06:22:03 compute-0 podman[213315]: 2025-11-25 06:22:03.277743228 +0000 UTC m=+0.041773401 container cleanup 5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 25 06:22:03 compute-0 systemd[1]: libpod-conmon-5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428.scope: Deactivated successfully.
Nov 25 06:22:03 compute-0 podman[213317]: 2025-11-25 06:22:03.287307616 +0000 UTC m=+0.044337575 container remove 5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.300 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[a8f902ed-3d94-453a-9c1e-befda6ccc490]: (4, ("Tue Nov 25 06:22:03 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b (5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428)\n5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428\nTue Nov 25 06:22:03 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b (5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428)\n5689ca87897436de8c1fee78a3e79008209a5bbfec0bb13d7943404425bb6428\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.302 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[521f4807-c87c-4db4-b3bc-adb16b252804]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.302 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b395525f-b7c1-4fad-a4fb-afb48a89a77b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b395525f-b7c1-4fad-a4fb-afb48a89a77b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.302 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[78dfd2f8-181f-4d25-8e01-eee768fdd032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.303 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb395525f-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.304 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:03 compute-0 kernel: tapb395525f-b0: left promiscuous mode
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.323 186245 DEBUG nova.compute.manager [req-1773334f-10f2-4068-a5a0-caf935af62ab req-a0e8cfef-1e1a-4bab-9418-a0b1613bd636 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-vif-unplugged-dc7318f2-544d-40fc-a3e1-24a837e45226 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.323 186245 DEBUG oslo_concurrency.lockutils [req-1773334f-10f2-4068-a5a0-caf935af62ab req-a0e8cfef-1e1a-4bab-9418-a0b1613bd636 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "394ce10b-bae7-43fa-b133-df28182f99db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.323 186245 DEBUG oslo_concurrency.lockutils [req-1773334f-10f2-4068-a5a0-caf935af62ab req-a0e8cfef-1e1a-4bab-9418-a0b1613bd636 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.323 186245 DEBUG oslo_concurrency.lockutils [req-1773334f-10f2-4068-a5a0-caf935af62ab req-a0e8cfef-1e1a-4bab-9418-a0b1613bd636 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.323 186245 DEBUG nova.compute.manager [req-1773334f-10f2-4068-a5a0-caf935af62ab req-a0e8cfef-1e1a-4bab-9418-a0b1613bd636 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] No waiting events found dispatching network-vif-unplugged-dc7318f2-544d-40fc-a3e1-24a837e45226 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.324 186245 WARNING nova.compute.manager [req-1773334f-10f2-4068-a5a0-caf935af62ab req-a0e8cfef-1e1a-4bab-9418-a0b1613bd636 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received unexpected event network-vif-unplugged-dc7318f2-544d-40fc-a3e1-24a837e45226 for instance with vm_state active and task_state None.
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.324 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5eeb725a-1a42-4e05-bf55-f17aa5d30062]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:03 compute-0 nova_compute[186241]: 2025-11-25 06:22:03.324 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.333 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[08cf9f8e-86c6-4308-a7eb-d1689813349b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.334 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[42bf5233-8b86-432a-a23c-66015255169a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.344 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[e3830086-c2ff-4b74-b070-f26bf01e265c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 270464, 'reachable_time': 36860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213344, 'error': None, 'target': 'ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:03 compute-0 systemd[1]: run-netns-ovnmeta\x2db395525f\x2db7c1\x2d4fad\x2da4fb\x2dafb48a89a77b.mount: Deactivated successfully.
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.346 104066 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b395525f-b7c1-4fad-a4fb-afb48a89a77b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 06:22:03 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:03.346 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3acf32-ed78-418f-b738-d767a97ab50c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:05 compute-0 nova_compute[186241]: 2025-11-25 06:22:05.479 186245 DEBUG nova.compute.manager [req-c88b1f0b-4b0f-45a3-b9c7-cb6199c08f38 req-722a0679-3d4c-43d3-a159-d750710ddebb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-vif-plugged-dc7318f2-544d-40fc-a3e1-24a837e45226 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:22:05 compute-0 nova_compute[186241]: 2025-11-25 06:22:05.479 186245 DEBUG oslo_concurrency.lockutils [req-c88b1f0b-4b0f-45a3-b9c7-cb6199c08f38 req-722a0679-3d4c-43d3-a159-d750710ddebb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "394ce10b-bae7-43fa-b133-df28182f99db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:22:05 compute-0 nova_compute[186241]: 2025-11-25 06:22:05.480 186245 DEBUG oslo_concurrency.lockutils [req-c88b1f0b-4b0f-45a3-b9c7-cb6199c08f38 req-722a0679-3d4c-43d3-a159-d750710ddebb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:22:05 compute-0 nova_compute[186241]: 2025-11-25 06:22:05.480 186245 DEBUG oslo_concurrency.lockutils [req-c88b1f0b-4b0f-45a3-b9c7-cb6199c08f38 req-722a0679-3d4c-43d3-a159-d750710ddebb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:22:05 compute-0 nova_compute[186241]: 2025-11-25 06:22:05.480 186245 DEBUG nova.compute.manager [req-c88b1f0b-4b0f-45a3-b9c7-cb6199c08f38 req-722a0679-3d4c-43d3-a159-d750710ddebb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] No waiting events found dispatching network-vif-plugged-dc7318f2-544d-40fc-a3e1-24a837e45226 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:22:05 compute-0 nova_compute[186241]: 2025-11-25 06:22:05.480 186245 WARNING nova.compute.manager [req-c88b1f0b-4b0f-45a3-b9c7-cb6199c08f38 req-722a0679-3d4c-43d3-a159-d750710ddebb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received unexpected event network-vif-plugged-dc7318f2-544d-40fc-a3e1-24a837e45226 for instance with vm_state active and task_state None.
Nov 25 06:22:06 compute-0 nova_compute[186241]: 2025-11-25 06:22:06.154 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:06 compute-0 nova_compute[186241]: 2025-11-25 06:22:06.921 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:09 compute-0 podman[213345]: 2025-11-25 06:22:09.067129723 +0000 UTC m=+0.044917680 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 25 06:22:11 compute-0 nova_compute[186241]: 2025-11-25 06:22:11.155 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:11 compute-0 nova_compute[186241]: 2025-11-25 06:22:11.922 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:14 compute-0 podman[213362]: 2025-11-25 06:22:14.05810081 +0000 UTC m=+0.037075301 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 25 06:22:16 compute-0 nova_compute[186241]: 2025-11-25 06:22:16.158 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:16 compute-0 nova_compute[186241]: 2025-11-25 06:22:16.924 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:17 compute-0 nova_compute[186241]: 2025-11-25 06:22:17.933 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:22:18 compute-0 nova_compute[186241]: 2025-11-25 06:22:18.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:22:20 compute-0 nova_compute[186241]: 2025-11-25 06:22:20.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:22:20 compute-0 nova_compute[186241]: 2025-11-25 06:22:20.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:22:20 compute-0 nova_compute[186241]: 2025-11-25 06:22:20.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:22:21 compute-0 nova_compute[186241]: 2025-11-25 06:22:21.160 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:21 compute-0 nova_compute[186241]: 2025-11-25 06:22:21.456 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:22:21 compute-0 nova_compute[186241]: 2025-11-25 06:22:21.457 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:22:21 compute-0 nova_compute[186241]: 2025-11-25 06:22:21.457 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:22:21 compute-0 nova_compute[186241]: 2025-11-25 06:22:21.457 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:22:21 compute-0 nova_compute[186241]: 2025-11-25 06:22:21.925 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:22 compute-0 podman[213385]: 2025-11-25 06:22:22.078808431 +0000 UTC m=+0.057550162 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:22:22 compute-0 nova_compute[186241]: 2025-11-25 06:22:22.481 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:22:22 compute-0 nova_compute[186241]: 2025-11-25 06:22:22.524 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:22:22 compute-0 nova_compute[186241]: 2025-11-25 06:22:22.525 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:22:22 compute-0 nova_compute[186241]: 2025-11-25 06:22:22.567 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:22:22 compute-0 nova_compute[186241]: 2025-11-25 06:22:22.748 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:22:22 compute-0 nova_compute[186241]: 2025-11-25 06:22:22.749 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5620MB free_disk=72.99286651611328GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:22:22 compute-0 nova_compute[186241]: 2025-11-25 06:22:22.749 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:22:22 compute-0 nova_compute[186241]: 2025-11-25 06:22:22.749 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.141 186245 WARNING nova.virt.libvirt.driver [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for libvirt event about the detach of device tapdc7318f2-54 with device alias net1 from instance 394ce10b-bae7-43fa-b133-df28182f99db is timed out.
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.142 186245 DEBUG nova.virt.libvirt.guest [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a5:56:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdc7318f2-54"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.144 186245 DEBUG nova.virt.libvirt.guest [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a5:56:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdc7318f2-54"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <name>instance-00000003</name>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <uuid>394ce10b-bae7-43fa-b133-df28182f99db</uuid>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-352064862</nova:name>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:21:58</nova:creationTime>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:port uuid="7704ac5e-d3f5-484e-b018-096af3d84408">
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:port uuid="dc7318f2-544d-40fc-a3e1-24a837e45226">
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:22:23 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <memory unit='KiB'>131072</memory>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <vcpu placement='static'>1</vcpu>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <resource>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <partition>/machine</partition>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </resource>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <sysinfo type='smbios'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <system>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <entry name='manufacturer'>RDO</entry>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <entry name='product'>OpenStack Compute</entry>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <entry name='serial'>394ce10b-bae7-43fa-b133-df28182f99db</entry>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <entry name='uuid'>394ce10b-bae7-43fa-b133-df28182f99db</entry>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <entry name='family'>Virtual Machine</entry>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </system>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <os>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <boot dev='hd'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <smbios mode='sysinfo'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </os>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <features>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <vmcoreinfo state='on'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </features>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <cpu mode='custom' match='exact' check='full'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <model fallback='forbid'>EPYC-Milan</model>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <vendor>AMD</vendor>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='x2apic'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc-deadline'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='hypervisor'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc_adjust'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='vaes'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='vpclmulqdq'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='spec-ctrl'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='stibp'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='ssbd'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='cmp_legacy'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='overflow-recov'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='succor'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='virt-ssbd'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='disable' name='lbrv'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='disable' name='tsc-scale'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='disable' name='vmcb-clean'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='disable' name='flushbyasid'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='disable' name='pause-filter'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='disable' name='pfthreshold'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='disable' name='v-vmsave-vmload'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='disable' name='vgif'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='disable' name='svm'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='require' name='topoext'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='disable' name='npt'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='disable' name='nrip-save'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <clock offset='utc'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <timer name='pit' tickpolicy='delay'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <timer name='hpet' present='no'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <on_poweroff>destroy</on_poweroff>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <on_reboot>restart</on_reboot>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <on_crash>destroy</on_crash>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <disk type='file' device='disk'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk' index='2'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <backingStore type='file' index='3'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:         <format type='raw'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:         <source file='/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:         <backingStore/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       </backingStore>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target dev='vda' bus='virtio'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='virtio-disk0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <disk type='file' device='cdrom'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <driver name='qemu' type='raw' cache='none'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk.config' index='1'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <backingStore/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target dev='sda' bus='sata'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <readonly/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='sata0-0-0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='0' model='pcie-root'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pcie.0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='1' port='0x10'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.1'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='2' port='0x11'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.2'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='3' port='0x12'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.3'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='4' port='0x13'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.4'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='5' port='0x14'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.5'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='6' port='0x15'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.6'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='7' port='0x16'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.7'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='8' port='0x17'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.8'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='9' port='0x18'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.9'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='10' port='0x19'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.10'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='11' port='0x1a'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.11'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='12' port='0x1b'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.12'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='13' port='0x1c'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.13'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='14' port='0x1d'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.14'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='15' port='0x1e'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.15'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='16' port='0x1f'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.16'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='17' port='0x20'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.17'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='18' port='0x21'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.18'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='19' port='0x22'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.19'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='20' port='0x23'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.20'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='21' port='0x24'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.21'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='22' port='0x25'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.22'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='23' port='0x26'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.23'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='24' port='0x27'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.24'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target chassis='25' port='0x28'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.25'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model name='pcie-pci-bridge'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='pci.26'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='usb'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <controller type='sata' index='0'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='ide'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <interface type='ethernet'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <mac address='fa:16:3e:21:28:be'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target dev='tap7704ac5e-d3'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model type='virtio'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <driver name='vhost' rx_queue_size='512'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <mtu size='1442'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='net0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <serial type='pty'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/console.log' append='off'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target type='isa-serial' port='0'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:         <model name='isa-serial'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       </target>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <console type='pty' tty='/dev/pts/0'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/console.log' append='off'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <target type='serial' port='0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </console>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <input type='tablet' bus='usb'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='input0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='usb' bus='0' port='1'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </input>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <input type='mouse' bus='ps2'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='input1'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </input>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <input type='keyboard' bus='ps2'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='input2'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </input>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <listen type='address' address='::0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </graphics>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <audio id='1' type='none'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <video>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <model type='virtio' heads='1' primary='yes'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='video0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </video>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <watchdog model='itco' action='reset'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='watchdog0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </watchdog>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <memballoon model='virtio'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <stats period='10'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='balloon0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <rng model='virtio'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <backend model='random'>/dev/urandom</backend>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <alias name='rng0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <label>system_u:system_r:svirt_t:s0:c326,c359</label>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c326,c359</imagelabel>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <label>+107:+107</label>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <imagelabel>+107:+107</imagelabel>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:22:23 compute-0 nova_compute[186241]: </domain>
Nov 25 06:22:23 compute-0 nova_compute[186241]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.145 186245 INFO nova.virt.libvirt.driver [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully detached device tapdc7318f2-54 from instance 394ce10b-bae7-43fa-b133-df28182f99db from the live domain config.
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.146 186245 DEBUG nova.virt.libvirt.vif [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:20:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-352064862',display_name='tempest-TestNetworkBasicOps-server-352064862',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-352064862',id=3,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5eRUy74odxp9Q2Am4HiDIkMdvRYPpw1VUK3zfp+EbN2Ota/jKN8edSaGUzCIGEJamacDqcH0lJ6H/skO0Xvp6BAJvgTjvLUerS98Msbl+Qa+0/i1uo7EnhHPR93WCglQ==',key_name='tempest-TestNetworkBasicOps-1679774923',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:21:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-0mx851oz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:21:18Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=394ce10b-bae7-43fa-b133-df28182f99db,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.146 186245 DEBUG nova.network.os_vif_util [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.146 186245 DEBUG nova.network.os_vif_util [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:56:99,bridge_name='br-int',has_traffic_filtering=True,id=dc7318f2-544d-40fc-a3e1-24a837e45226,network=Network(b395525f-b7c1-4fad-a4fb-afb48a89a77b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc7318f2-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.147 186245 DEBUG os_vif [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:56:99,bridge_name='br-int',has_traffic_filtering=True,id=dc7318f2-544d-40fc-a3e1-24a837e45226,network=Network(b395525f-b7c1-4fad-a4fb-afb48a89a77b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc7318f2-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.148 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.149 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc7318f2-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.150 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.152 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.152 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.153 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=103bb58b-db71-453e-83bb-d656f76a62d6) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.155 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.157 186245 INFO os_vif [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:56:99,bridge_name='br-int',has_traffic_filtering=True,id=dc7318f2-544d-40fc-a3e1-24a837e45226,network=Network(b395525f-b7c1-4fad-a4fb-afb48a89a77b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc7318f2-54')
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.158 186245 DEBUG nova.virt.driver [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-352064862', uuid='394ce10b-bae7-43fa-b133-df28182f99db'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus='sata',hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus='virtio',hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus='usb',hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type='q35',hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model='usbtablet',hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model='virtio',hw_video_ram=<?>,hw_vif_model='virtio',hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bitt
orrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764051743.1579368) get_instance_driver_metadata 
/usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.158 186245 DEBUG nova.virt.libvirt.guest [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-352064862</nova:name>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:22:23</nova:creationTime>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     <nova:port uuid="7704ac5e-d3f5-484e-b018-096af3d84408">
Nov 25 06:22:23 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 25 06:22:23 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:22:23 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:22:23 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:22:23 compute-0 nova_compute[186241]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:356
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.792 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance 394ce10b-bae7-43fa-b133-df28182f99db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.793 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.793 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:22:23 compute-0 nova_compute[186241]: 2025-11-25 06:22:23.822 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:22:24 compute-0 nova_compute[186241]: 2025-11-25 06:22:24.327 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:22:24 compute-0 nova_compute[186241]: 2025-11-25 06:22:24.328 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:22:24 compute-0 nova_compute[186241]: 2025-11-25 06:22:24.328 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:22:25 compute-0 nova_compute[186241]: 2025-11-25 06:22:25.324 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:22:25 compute-0 nova_compute[186241]: 2025-11-25 06:22:25.324 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:22:25 compute-0 nova_compute[186241]: 2025-11-25 06:22:25.325 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:22:25 compute-0 nova_compute[186241]: 2025-11-25 06:22:25.325 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:22:26 compute-0 podman[213414]: 2025-11-25 06:22:26.067981476 +0000 UTC m=+0.044105448 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 06:22:26 compute-0 podman[213415]: 2025-11-25 06:22:26.090070183 +0000 UTC m=+0.065425976 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:22:26 compute-0 nova_compute[186241]: 2025-11-25 06:22:26.421 186245 DEBUG nova.compute.manager [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-vif-deleted-dc7318f2-544d-40fc-a3e1-24a837e45226 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:22:26 compute-0 nova_compute[186241]: 2025-11-25 06:22:26.421 186245 INFO nova.compute.manager [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Neutron deleted interface dc7318f2-544d-40fc-a3e1-24a837e45226; detaching it from the instance and deleting it from the info cache
Nov 25 06:22:26 compute-0 nova_compute[186241]: 2025-11-25 06:22:26.421 186245 DEBUG nova.network.neutron [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Updating instance_info_cache with network_info: [{"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:22:26 compute-0 nova_compute[186241]: 2025-11-25 06:22:26.926 186245 DEBUG nova.objects.instance [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lazy-loading 'system_metadata' on Instance uuid 394ce10b-bae7-43fa-b133-df28182f99db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:22:26 compute-0 nova_compute[186241]: 2025-11-25 06:22:26.927 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.184 186245 DEBUG oslo_concurrency.lockutils [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.185 186245 DEBUG oslo_concurrency.lockutils [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.186 186245 DEBUG nova.network.neutron [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:22:27 compute-0 ovn_controller[95135]: 2025-11-25T06:22:27Z|00074|binding|INFO|Releasing lport 78d6d1f7-b362-45aa-9257-0f3da31c1b09 from this chassis (sb_readonly=0)
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.341 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.431 186245 DEBUG nova.objects.instance [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lazy-loading 'flavor' on Instance uuid 394ce10b-bae7-43fa-b133-df28182f99db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.935 186245 DEBUG nova.objects.base [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Object Instance<394ce10b-bae7-43fa-b133-df28182f99db> lazy-loaded attributes: system_metadata,flavor wrapper /usr/lib/python3.9/site-packages/nova/objects/base.py:136
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.936 186245 DEBUG nova.virt.libvirt.vif [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:20:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-352064862',display_name='tempest-TestNetworkBasicOps-server-352064862',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-352064862',id=3,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5eRUy74odxp9Q2Am4HiDIkMdvRYPpw1VUK3zfp+EbN2Ota/jKN8edSaGUzCIGEJamacDqcH0lJ6H/skO0Xvp6BAJvgTjvLUerS98Msbl+Qa+0/i1uo7EnhHPR93WCglQ==',key_name='tempest-TestNetworkBasicOps-1679774923',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:21:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-0mx851oz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:21:18Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=394ce10b-bae7-43fa-b133-df28182f99db,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.936 186245 DEBUG nova.network.os_vif_util [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Converting VIF {"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.937 186245 DEBUG nova.network.os_vif_util [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:56:99,bridge_name='br-int',has_traffic_filtering=True,id=dc7318f2-544d-40fc-a3e1-24a837e45226,network=Network(b395525f-b7c1-4fad-a4fb-afb48a89a77b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc7318f2-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.938 186245 DEBUG nova.virt.libvirt.guest [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a5:56:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdc7318f2-54"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.940 186245 DEBUG nova.virt.libvirt.guest [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a5:56:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdc7318f2-54"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <name>instance-00000003</name>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <uuid>394ce10b-bae7-43fa-b133-df28182f99db</uuid>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-352064862</nova:name>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:22:23</nova:creationTime>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:port uuid="7704ac5e-d3f5-484e-b018-096af3d84408">
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:22:27 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <memory unit='KiB'>131072</memory>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <vcpu placement='static'>1</vcpu>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <resource>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <partition>/machine</partition>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </resource>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <sysinfo type='smbios'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <system>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <entry name='manufacturer'>RDO</entry>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <entry name='product'>OpenStack Compute</entry>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <entry name='serial'>394ce10b-bae7-43fa-b133-df28182f99db</entry>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <entry name='uuid'>394ce10b-bae7-43fa-b133-df28182f99db</entry>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <entry name='family'>Virtual Machine</entry>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </system>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <os>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <boot dev='hd'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <smbios mode='sysinfo'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </os>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <features>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <vmcoreinfo state='on'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </features>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <cpu mode='custom' match='exact' check='full'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <model fallback='forbid'>EPYC-Milan</model>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <vendor>AMD</vendor>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='x2apic'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc-deadline'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='hypervisor'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc_adjust'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='vaes'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='vpclmulqdq'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='spec-ctrl'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='stibp'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='ssbd'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='cmp_legacy'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='overflow-recov'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='succor'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='virt-ssbd'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='lbrv'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='tsc-scale'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='vmcb-clean'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='flushbyasid'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='pause-filter'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='pfthreshold'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='v-vmsave-vmload'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='vgif'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='svm'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='topoext'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='npt'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='nrip-save'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <clock offset='utc'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <timer name='pit' tickpolicy='delay'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <timer name='hpet' present='no'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <on_poweroff>destroy</on_poweroff>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <on_reboot>restart</on_reboot>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <on_crash>destroy</on_crash>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <disk type='file' device='disk'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk' index='2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <backingStore type='file' index='3'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:         <format type='raw'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:         <source file='/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:         <backingStore/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       </backingStore>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target dev='vda' bus='virtio'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='virtio-disk0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <disk type='file' device='cdrom'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <driver name='qemu' type='raw' cache='none'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk.config' index='1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <backingStore/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target dev='sda' bus='sata'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <readonly/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='sata0-0-0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='0' model='pcie-root'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pcie.0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='1' port='0x10'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='2' port='0x11'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='3' port='0x12'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.3'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='4' port='0x13'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.4'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='5' port='0x14'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.5'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='6' port='0x15'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.6'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='7' port='0x16'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.7'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='8' port='0x17'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.8'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='9' port='0x18'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.9'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='10' port='0x19'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.10'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='11' port='0x1a'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.11'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='12' port='0x1b'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.12'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='13' port='0x1c'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.13'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='14' port='0x1d'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.14'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='15' port='0x1e'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.15'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='16' port='0x1f'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.16'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='17' port='0x20'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.17'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='18' port='0x21'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.18'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='19' port='0x22'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.19'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='20' port='0x23'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.20'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='21' port='0x24'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.21'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='22' port='0x25'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.22'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='23' port='0x26'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.23'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='24' port='0x27'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.24'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='25' port='0x28'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.25'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-pci-bridge'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.26'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='usb'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='sata' index='0'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='ide'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <interface type='ethernet'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <mac address='fa:16:3e:21:28:be'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target dev='tap7704ac5e-d3'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model type='virtio'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <driver name='vhost' rx_queue_size='512'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <mtu size='1442'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='net0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <serial type='pty'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/console.log' append='off'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target type='isa-serial' port='0'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:         <model name='isa-serial'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       </target>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <console type='pty' tty='/dev/pts/0'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/console.log' append='off'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target type='serial' port='0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </console>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <input type='tablet' bus='usb'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='input0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='usb' bus='0' port='1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </input>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <input type='mouse' bus='ps2'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='input1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </input>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <input type='keyboard' bus='ps2'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='input2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </input>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <listen type='address' address='::0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </graphics>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <audio id='1' type='none'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <video>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model type='virtio' heads='1' primary='yes'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='video0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </video>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <watchdog model='itco' action='reset'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='watchdog0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </watchdog>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <memballoon model='virtio'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <stats period='10'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='balloon0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <rng model='virtio'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <backend model='random'>/dev/urandom</backend>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='rng0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <label>system_u:system_r:svirt_t:s0:c326,c359</label>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c326,c359</imagelabel>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <label>+107:+107</label>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <imagelabel>+107:+107</imagelabel>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:22:27 compute-0 nova_compute[186241]: </domain>
Nov 25 06:22:27 compute-0 nova_compute[186241]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.941 186245 DEBUG nova.virt.libvirt.guest [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a5:56:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdc7318f2-54"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.944 186245 DEBUG nova.virt.libvirt.guest [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a5:56:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdc7318f2-54"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <name>instance-00000003</name>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <uuid>394ce10b-bae7-43fa-b133-df28182f99db</uuid>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-352064862</nova:name>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:22:23</nova:creationTime>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:port uuid="7704ac5e-d3f5-484e-b018-096af3d84408">
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:22:27 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <memory unit='KiB'>131072</memory>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <vcpu placement='static'>1</vcpu>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <resource>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <partition>/machine</partition>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </resource>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <sysinfo type='smbios'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <system>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <entry name='manufacturer'>RDO</entry>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <entry name='product'>OpenStack Compute</entry>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <entry name='serial'>394ce10b-bae7-43fa-b133-df28182f99db</entry>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <entry name='uuid'>394ce10b-bae7-43fa-b133-df28182f99db</entry>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <entry name='family'>Virtual Machine</entry>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </system>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <os>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <boot dev='hd'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <smbios mode='sysinfo'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </os>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <features>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <vmcoreinfo state='on'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </features>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <cpu mode='custom' match='exact' check='full'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <model fallback='forbid'>EPYC-Milan</model>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <vendor>AMD</vendor>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='x2apic'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc-deadline'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='hypervisor'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc_adjust'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='vaes'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='vpclmulqdq'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='spec-ctrl'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='stibp'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='ssbd'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='cmp_legacy'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='overflow-recov'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='succor'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='virt-ssbd'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='lbrv'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='tsc-scale'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='vmcb-clean'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='flushbyasid'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='pause-filter'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='pfthreshold'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='v-vmsave-vmload'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='vgif'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='svm'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='require' name='topoext'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='npt'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='nrip-save'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <clock offset='utc'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <timer name='pit' tickpolicy='delay'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <timer name='hpet' present='no'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <on_poweroff>destroy</on_poweroff>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <on_reboot>restart</on_reboot>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <on_crash>destroy</on_crash>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <disk type='file' device='disk'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk' index='2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <backingStore type='file' index='3'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:         <format type='raw'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:         <source file='/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:         <backingStore/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       </backingStore>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target dev='vda' bus='virtio'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='virtio-disk0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <disk type='file' device='cdrom'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <driver name='qemu' type='raw' cache='none'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/disk.config' index='1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <backingStore/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target dev='sda' bus='sata'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <readonly/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='sata0-0-0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='0' model='pcie-root'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pcie.0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='1' port='0x10'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='2' port='0x11'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='3' port='0x12'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.3'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='4' port='0x13'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.4'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='5' port='0x14'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.5'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='6' port='0x15'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.6'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='7' port='0x16'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.7'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='8' port='0x17'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.8'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='9' port='0x18'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.9'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='10' port='0x19'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.10'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='11' port='0x1a'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.11'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='12' port='0x1b'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.12'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='13' port='0x1c'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.13'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='14' port='0x1d'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.14'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='15' port='0x1e'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.15'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='16' port='0x1f'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.16'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='17' port='0x20'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.17'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='18' port='0x21'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.18'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='19' port='0x22'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.19'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='20' port='0x23'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.20'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='21' port='0x24'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.21'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='22' port='0x25'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.22'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='23' port='0x26'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.23'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='24' port='0x27'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.24'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target chassis='25' port='0x28'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.25'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model name='pcie-pci-bridge'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='pci.26'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='usb'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <controller type='sata' index='0'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='ide'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <interface type='ethernet'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <mac address='fa:16:3e:21:28:be'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target dev='tap7704ac5e-d3'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model type='virtio'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <driver name='vhost' rx_queue_size='512'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <mtu size='1442'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='net0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <serial type='pty'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/console.log' append='off'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target type='isa-serial' port='0'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:         <model name='isa-serial'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       </target>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <console type='pty' tty='/dev/pts/0'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db/console.log' append='off'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <target type='serial' port='0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </console>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <input type='tablet' bus='usb'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='input0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='usb' bus='0' port='1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </input>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <input type='mouse' bus='ps2'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='input1'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </input>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <input type='keyboard' bus='ps2'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='input2'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </input>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <listen type='address' address='::0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </graphics>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <audio id='1' type='none'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <video>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <model type='virtio' heads='1' primary='yes'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='video0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </video>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <watchdog model='itco' action='reset'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='watchdog0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </watchdog>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <memballoon model='virtio'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <stats period='10'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='balloon0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <rng model='virtio'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <backend model='random'>/dev/urandom</backend>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <alias name='rng0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <label>system_u:system_r:svirt_t:s0:c326,c359</label>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c326,c359</imagelabel>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <label>+107:+107</label>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <imagelabel>+107:+107</imagelabel>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:22:27 compute-0 nova_compute[186241]: </domain>
Nov 25 06:22:27 compute-0 nova_compute[186241]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.944 186245 WARNING nova.virt.libvirt.driver [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Detaching interface fa:16:3e:a5:56:99 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapdc7318f2-54' not found.
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.944 186245 DEBUG nova.virt.libvirt.vif [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:20:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-352064862',display_name='tempest-TestNetworkBasicOps-server-352064862',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-352064862',id=3,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5eRUy74odxp9Q2Am4HiDIkMdvRYPpw1VUK3zfp+EbN2Ota/jKN8edSaGUzCIGEJamacDqcH0lJ6H/skO0Xvp6BAJvgTjvLUerS98Msbl+Qa+0/i1uo7EnhHPR93WCglQ==',key_name='tempest-TestNetworkBasicOps-1679774923',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:21:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-0mx851oz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:21:18Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=394ce10b-bae7-43fa-b133-df28182f99db,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.945 186245 DEBUG nova.network.os_vif_util [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Converting VIF {"id": "dc7318f2-544d-40fc-a3e1-24a837e45226", "address": "fa:16:3e:a5:56:99", "network": {"id": "b395525f-b7c1-4fad-a4fb-afb48a89a77b", "bridge": "br-int", "label": "tempest-network-smoke--105654638", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc7318f2-54", "ovs_interfaceid": "dc7318f2-544d-40fc-a3e1-24a837e45226", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.945 186245 DEBUG nova.network.os_vif_util [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:56:99,bridge_name='br-int',has_traffic_filtering=True,id=dc7318f2-544d-40fc-a3e1-24a837e45226,network=Network(b395525f-b7c1-4fad-a4fb-afb48a89a77b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc7318f2-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.945 186245 DEBUG os_vif [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:56:99,bridge_name='br-int',has_traffic_filtering=True,id=dc7318f2-544d-40fc-a3e1-24a837e45226,network=Network(b395525f-b7c1-4fad-a4fb-afb48a89a77b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc7318f2-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.946 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.947 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc7318f2-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.947 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.948 186245 INFO os_vif [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:56:99,bridge_name='br-int',has_traffic_filtering=True,id=dc7318f2-544d-40fc-a3e1-24a837e45226,network=Network(b395525f-b7c1-4fad-a4fb-afb48a89a77b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc7318f2-54')
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.949 186245 DEBUG nova.virt.driver [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-352064862', uuid='394ce10b-bae7-43fa-b133-df28182f99db'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus='sata',hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus='virtio',hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus='usb',hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type='q35',hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model='usbtablet',hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model='virtio',hw_video_ram=<?>,hw_vif_model='virtio',hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdo
g_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764051747.9492846) 
get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:22:27 compute-0 nova_compute[186241]: 2025-11-25 06:22:27.949 186245 DEBUG nova.virt.libvirt.guest [req-37a9d291-d942-44b9-8faf-e14b468e757a req-ceb3a355-8865-4f4f-8891-3d280defcb33 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-352064862</nova:name>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:22:27</nova:creationTime>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     <nova:port uuid="7704ac5e-d3f5-484e-b018-096af3d84408">
Nov 25 06:22:27 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 25 06:22:27 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:22:27 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:22:27 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:22:27 compute-0 nova_compute[186241]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:356
Nov 25 06:22:28 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:28.141 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:22:28 compute-0 nova_compute[186241]: 2025-11-25 06:22:28.141 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:28 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:28.142 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:22:28 compute-0 nova_compute[186241]: 2025-11-25 06:22:28.154 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:28 compute-0 nova_compute[186241]: 2025-11-25 06:22:28.451 186245 DEBUG nova.compute.manager [req-cd66bf94-9e0c-430b-8b97-334c20adf843 req-644fe9e7-de02-41a9-870c-435c111426d7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-changed-7704ac5e-d3f5-484e-b018-096af3d84408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:22:28 compute-0 nova_compute[186241]: 2025-11-25 06:22:28.451 186245 DEBUG nova.compute.manager [req-cd66bf94-9e0c-430b-8b97-334c20adf843 req-644fe9e7-de02-41a9-870c-435c111426d7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Refreshing instance network info cache due to event network-changed-7704ac5e-d3f5-484e-b018-096af3d84408. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:22:28 compute-0 nova_compute[186241]: 2025-11-25 06:22:28.452 186245 DEBUG oslo_concurrency.lockutils [req-cd66bf94-9e0c-430b-8b97-334c20adf843 req-644fe9e7-de02-41a9-870c-435c111426d7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:22:28 compute-0 nova_compute[186241]: 2025-11-25 06:22:28.977 186245 DEBUG oslo_concurrency.lockutils [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "394ce10b-bae7-43fa-b133-df28182f99db" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:22:28 compute-0 nova_compute[186241]: 2025-11-25 06:22:28.977 186245 DEBUG oslo_concurrency.lockutils [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:22:28 compute-0 nova_compute[186241]: 2025-11-25 06:22:28.978 186245 DEBUG oslo_concurrency.lockutils [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "394ce10b-bae7-43fa-b133-df28182f99db-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:22:28 compute-0 nova_compute[186241]: 2025-11-25 06:22:28.978 186245 DEBUG oslo_concurrency.lockutils [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:22:28 compute-0 nova_compute[186241]: 2025-11-25 06:22:28.978 186245 DEBUG oslo_concurrency.lockutils [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:22:28 compute-0 nova_compute[186241]: 2025-11-25 06:22:28.979 186245 INFO nova.compute.manager [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Terminating instance
Nov 25 06:22:29 compute-0 nova_compute[186241]: 2025-11-25 06:22:29.483 186245 DEBUG nova.compute.manager [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:22:29 compute-0 kernel: tap7704ac5e-d3 (unregistering): left promiscuous mode
Nov 25 06:22:29 compute-0 NetworkManager[55345]: <info>  [1764051749.5102] device (tap7704ac5e-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:22:29 compute-0 ovn_controller[95135]: 2025-11-25T06:22:29Z|00075|binding|INFO|Releasing lport 7704ac5e-d3f5-484e-b018-096af3d84408 from this chassis (sb_readonly=0)
Nov 25 06:22:29 compute-0 ovn_controller[95135]: 2025-11-25T06:22:29Z|00076|binding|INFO|Setting lport 7704ac5e-d3f5-484e-b018-096af3d84408 down in Southbound
Nov 25 06:22:29 compute-0 nova_compute[186241]: 2025-11-25 06:22:29.516 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:29 compute-0 ovn_controller[95135]: 2025-11-25T06:22:29Z|00077|binding|INFO|Removing iface tap7704ac5e-d3 ovn-installed in OVS
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.529 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:28:be 10.100.0.8'], port_security=['fa:16:3e:21:28:be 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '394ce10b-bae7-43fa-b133-df28182f99db', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48e22ff7-b3ad-4c32-9660-e2abd8947790', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '297c5270-251e-452e-ac3b-951ab3a33218', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad33700d-cdbb-45fc-843f-b6325c07b4bf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=7704ac5e-d3f5-484e-b018-096af3d84408) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.530 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 7704ac5e-d3f5-484e-b018-096af3d84408 in datapath 48e22ff7-b3ad-4c32-9660-e2abd8947790 unbound from our chassis
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.531 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48e22ff7-b3ad-4c32-9660-e2abd8947790, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.531 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[89693a96-1ca8-45c7-a795-6c06a473ae8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.532 103953 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790 namespace which is not needed anymore
Nov 25 06:22:29 compute-0 nova_compute[186241]: 2025-11-25 06:22:29.541 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:29 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Nov 25 06:22:29 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 12.310s CPU time.
Nov 25 06:22:29 compute-0 systemd-machined[152921]: Machine qemu-3-instance-00000003 terminated.
Nov 25 06:22:29 compute-0 podman[213454]: 2025-11-25 06:22:29.568635167 +0000 UTC m=+0.043374250 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_metadata_agent)
Nov 25 06:22:29 compute-0 neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790[212934]: [NOTICE]   (212938) : haproxy version is 2.8.14-c23fe91
Nov 25 06:22:29 compute-0 neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790[212934]: [NOTICE]   (212938) : path to executable is /usr/sbin/haproxy
Nov 25 06:22:29 compute-0 neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790[212934]: [WARNING]  (212938) : Exiting Master process...
Nov 25 06:22:29 compute-0 neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790[212934]: [ALERT]    (212938) : Current worker (212940) exited with code 143 (Terminated)
Nov 25 06:22:29 compute-0 neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790[212934]: [WARNING]  (212938) : All workers exited. Exiting... (0)
Nov 25 06:22:29 compute-0 podman[213492]: 2025-11-25 06:22:29.611145537 +0000 UTC m=+0.019427850 container kill 5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3)
Nov 25 06:22:29 compute-0 systemd[1]: libpod-5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c.scope: Deactivated successfully.
Nov 25 06:22:29 compute-0 podman[213503]: 2025-11-25 06:22:29.645276683 +0000 UTC m=+0.019822825 container died 5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:22:29 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c-userdata-shm.mount: Deactivated successfully.
Nov 25 06:22:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-ed0b142be243ca2eb3e10ae11550ff6f62f0739263890452d449cfd97506ef67-merged.mount: Deactivated successfully.
Nov 25 06:22:29 compute-0 podman[213503]: 2025-11-25 06:22:29.664111553 +0000 UTC m=+0.038657675 container cleanup 5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 06:22:29 compute-0 systemd[1]: libpod-conmon-5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c.scope: Deactivated successfully.
Nov 25 06:22:29 compute-0 podman[213505]: 2025-11-25 06:22:29.674897063 +0000 UTC m=+0.045618782 container remove 5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.678 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8dae0dd6-ed3d-414a-ab5c-37f27eeb5b29]: (4, ("Tue Nov 25 06:22:29 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790 (5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c)\n5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c\nTue Nov 25 06:22:29 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790 (5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c)\n5e2d7182fe7ac99b9c774d258919667e0cca19743615379c52ee3bc021156e9c\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.679 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7243d903-d02d-4a39-86d0-2a37c53321ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.679 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48e22ff7-b3ad-4c32-9660-e2abd8947790.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48e22ff7-b3ad-4c32-9660-e2abd8947790.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.679 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[aa8e7ef0-eba4-4bfc-af3b-63600f73a433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.680 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48e22ff7-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:22:29 compute-0 nova_compute[186241]: 2025-11-25 06:22:29.681 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:29 compute-0 kernel: tap48e22ff7-b0: left promiscuous mode
Nov 25 06:22:29 compute-0 NetworkManager[55345]: <info>  [1764051749.6986] manager: (tap7704ac5e-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Nov 25 06:22:29 compute-0 nova_compute[186241]: 2025-11-25 06:22:29.698 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:29 compute-0 nova_compute[186241]: 2025-11-25 06:22:29.699 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.701 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ff0c0f50-5ef2-4a02-b439-511d602734d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.711 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[422d1b87-2b16-44cf-a420-b989ecdc61df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.712 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[e96a0b53-0248-4bf2-a072-37b63607813e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.726 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[b62cf48e-f8ec-4147-8d13-a87d38239557]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 266486, 'reachable_time': 36311, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213541, 'error': None, 'target': 'ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:29 compute-0 systemd[1]: run-netns-ovnmeta\x2d48e22ff7\x2db3ad\x2d4c32\x2d9660\x2de2abd8947790.mount: Deactivated successfully.
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.728 104066 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-48e22ff7-b3ad-4c32-9660-e2abd8947790 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 06:22:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:29.728 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[68b9d3c5-5508-47d6-90ca-81cbe1d4a9d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:29 compute-0 nova_compute[186241]: 2025-11-25 06:22:29.730 186245 INFO nova.virt.libvirt.driver [-] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Instance destroyed successfully.
Nov 25 06:22:29 compute-0 nova_compute[186241]: 2025-11-25 06:22:29.730 186245 DEBUG nova.objects.instance [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid 394ce10b-bae7-43fa-b133-df28182f99db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.233 186245 DEBUG nova.virt.libvirt.vif [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:20:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-352064862',display_name='tempest-TestNetworkBasicOps-server-352064862',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-352064862',id=3,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5eRUy74odxp9Q2Am4HiDIkMdvRYPpw1VUK3zfp+EbN2Ota/jKN8edSaGUzCIGEJamacDqcH0lJ6H/skO0Xvp6BAJvgTjvLUerS98Msbl+Qa+0/i1uo7EnhHPR93WCglQ==',key_name='tempest-TestNetworkBasicOps-1679774923',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:21:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-0mx851oz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:21:18Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=394ce10b-bae7-43fa-b133-df28182f99db,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.234 186245 DEBUG nova.network.os_vif_util [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.234 186245 DEBUG nova.network.os_vif_util [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:21:28:be,bridge_name='br-int',has_traffic_filtering=True,id=7704ac5e-d3f5-484e-b018-096af3d84408,network=Network(48e22ff7-b3ad-4c32-9660-e2abd8947790),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7704ac5e-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.235 186245 DEBUG os_vif [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:28:be,bridge_name='br-int',has_traffic_filtering=True,id=7704ac5e-d3f5-484e-b018-096af3d84408,network=Network(48e22ff7-b3ad-4c32-9660-e2abd8947790),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7704ac5e-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.236 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.236 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7704ac5e-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.237 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.238 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.239 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.239 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=44cfdfb1-52ac-45e6-8670-4ddeb88ae522) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.240 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.240 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.241 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.243 186245 INFO os_vif [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:28:be,bridge_name='br-int',has_traffic_filtering=True,id=7704ac5e-d3f5-484e-b018-096af3d84408,network=Network(48e22ff7-b3ad-4c32-9660-e2abd8947790),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7704ac5e-d3')
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.243 186245 INFO nova.virt.libvirt.driver [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Deleting instance files /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db_del
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.244 186245 INFO nova.virt.libvirt.driver [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Deletion of /var/lib/nova/instances/394ce10b-bae7-43fa-b133-df28182f99db_del complete
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.600 186245 DEBUG nova.compute.manager [req-7235d4ea-32b5-4ddb-8d9a-5c897196d185 req-3eb8ec8a-9320-4425-b086-377bed3ede6f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-vif-unplugged-7704ac5e-d3f5-484e-b018-096af3d84408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.600 186245 DEBUG oslo_concurrency.lockutils [req-7235d4ea-32b5-4ddb-8d9a-5c897196d185 req-3eb8ec8a-9320-4425-b086-377bed3ede6f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "394ce10b-bae7-43fa-b133-df28182f99db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.601 186245 DEBUG oslo_concurrency.lockutils [req-7235d4ea-32b5-4ddb-8d9a-5c897196d185 req-3eb8ec8a-9320-4425-b086-377bed3ede6f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.601 186245 DEBUG oslo_concurrency.lockutils [req-7235d4ea-32b5-4ddb-8d9a-5c897196d185 req-3eb8ec8a-9320-4425-b086-377bed3ede6f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.601 186245 DEBUG nova.compute.manager [req-7235d4ea-32b5-4ddb-8d9a-5c897196d185 req-3eb8ec8a-9320-4425-b086-377bed3ede6f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] No waiting events found dispatching network-vif-unplugged-7704ac5e-d3f5-484e-b018-096af3d84408 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.601 186245 DEBUG nova.compute.manager [req-7235d4ea-32b5-4ddb-8d9a-5c897196d185 req-3eb8ec8a-9320-4425-b086-377bed3ede6f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-vif-unplugged-7704ac5e-d3f5-484e-b018-096af3d84408 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.601 186245 DEBUG nova.compute.manager [req-7235d4ea-32b5-4ddb-8d9a-5c897196d185 req-3eb8ec8a-9320-4425-b086-377bed3ede6f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-vif-plugged-7704ac5e-d3f5-484e-b018-096af3d84408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.601 186245 DEBUG oslo_concurrency.lockutils [req-7235d4ea-32b5-4ddb-8d9a-5c897196d185 req-3eb8ec8a-9320-4425-b086-377bed3ede6f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "394ce10b-bae7-43fa-b133-df28182f99db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.601 186245 DEBUG oslo_concurrency.lockutils [req-7235d4ea-32b5-4ddb-8d9a-5c897196d185 req-3eb8ec8a-9320-4425-b086-377bed3ede6f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.601 186245 DEBUG oslo_concurrency.lockutils [req-7235d4ea-32b5-4ddb-8d9a-5c897196d185 req-3eb8ec8a-9320-4425-b086-377bed3ede6f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.602 186245 DEBUG nova.compute.manager [req-7235d4ea-32b5-4ddb-8d9a-5c897196d185 req-3eb8ec8a-9320-4425-b086-377bed3ede6f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] No waiting events found dispatching network-vif-plugged-7704ac5e-d3f5-484e-b018-096af3d84408 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.602 186245 WARNING nova.compute.manager [req-7235d4ea-32b5-4ddb-8d9a-5c897196d185 req-3eb8ec8a-9320-4425-b086-377bed3ede6f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received unexpected event network-vif-plugged-7704ac5e-d3f5-484e-b018-096af3d84408 for instance with vm_state active and task_state deleting.
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.752 186245 INFO nova.compute.manager [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Took 1.27 seconds to destroy the instance on the hypervisor.
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.753 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.753 186245 DEBUG nova.compute.manager [-] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:22:30 compute-0 nova_compute[186241]: 2025-11-25 06:22:30.753 186245 DEBUG nova.network.neutron [-] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:22:31 compute-0 nova_compute[186241]: 2025-11-25 06:22:31.260 186245 DEBUG nova.network.neutron [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Updating instance_info_cache with network_info: [{"id": "7704ac5e-d3f5-484e-b018-096af3d84408", "address": "fa:16:3e:21:28:be", "network": {"id": "48e22ff7-b3ad-4c32-9660-e2abd8947790", "bridge": "br-int", "label": "tempest-network-smoke--1925175189", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7704ac5e-d3", "ovs_interfaceid": "7704ac5e-d3f5-484e-b018-096af3d84408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:22:31 compute-0 nova_compute[186241]: 2025-11-25 06:22:31.764 186245 DEBUG oslo_concurrency.lockutils [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:22:31 compute-0 nova_compute[186241]: 2025-11-25 06:22:31.765 186245 DEBUG oslo_concurrency.lockutils [req-cd66bf94-9e0c-430b-8b97-334c20adf843 req-644fe9e7-de02-41a9-870c-435c111426d7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:22:31 compute-0 nova_compute[186241]: 2025-11-25 06:22:31.766 186245 DEBUG nova.network.neutron [req-cd66bf94-9e0c-430b-8b97-334c20adf843 req-644fe9e7-de02-41a9-870c-435c111426d7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Refreshing network info cache for port 7704ac5e-d3f5-484e-b018-096af3d84408 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:22:31 compute-0 nova_compute[186241]: 2025-11-25 06:22:31.927 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:32 compute-0 nova_compute[186241]: 2025-11-25 06:22:32.270 186245 DEBUG oslo_concurrency.lockutils [None req-a1de6261-123b-4b1c-be69-47d8f23a6005 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "interface-394ce10b-bae7-43fa-b133-df28182f99db-dc7318f2-544d-40fc-a3e1-24a837e45226" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 30.225s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:22:32 compute-0 nova_compute[186241]: 2025-11-25 06:22:32.340 186245 DEBUG nova.compute.manager [req-63cc2d79-539c-4bb2-9ee9-9c3e7e7d4d35 req-8f8b4f9f-3ebb-49ac-8d61-3860f8fc6c2f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Received event network-vif-deleted-7704ac5e-d3f5-484e-b018-096af3d84408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:22:32 compute-0 nova_compute[186241]: 2025-11-25 06:22:32.340 186245 INFO nova.compute.manager [req-63cc2d79-539c-4bb2-9ee9-9c3e7e7d4d35 req-8f8b4f9f-3ebb-49ac-8d61-3860f8fc6c2f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Neutron deleted interface 7704ac5e-d3f5-484e-b018-096af3d84408; detaching it from the instance and deleting it from the info cache
Nov 25 06:22:32 compute-0 nova_compute[186241]: 2025-11-25 06:22:32.340 186245 DEBUG nova.network.neutron [req-63cc2d79-539c-4bb2-9ee9-9c3e7e7d4d35 req-8f8b4f9f-3ebb-49ac-8d61-3860f8fc6c2f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:22:32 compute-0 nova_compute[186241]: 2025-11-25 06:22:32.578 186245 INFO nova.network.neutron [req-cd66bf94-9e0c-430b-8b97-334c20adf843 req-644fe9e7-de02-41a9-870c-435c111426d7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Port 7704ac5e-d3f5-484e-b018-096af3d84408 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 25 06:22:32 compute-0 nova_compute[186241]: 2025-11-25 06:22:32.578 186245 DEBUG nova.network.neutron [req-cd66bf94-9e0c-430b-8b97-334c20adf843 req-644fe9e7-de02-41a9-870c-435c111426d7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:22:32 compute-0 nova_compute[186241]: 2025-11-25 06:22:32.696 186245 DEBUG nova.network.neutron [-] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:22:32 compute-0 nova_compute[186241]: 2025-11-25 06:22:32.844 186245 DEBUG nova.compute.manager [req-63cc2d79-539c-4bb2-9ee9-9c3e7e7d4d35 req-8f8b4f9f-3ebb-49ac-8d61-3860f8fc6c2f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Detach interface failed, port_id=7704ac5e-d3f5-484e-b018-096af3d84408, reason: Instance 394ce10b-bae7-43fa-b133-df28182f99db could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Nov 25 06:22:33 compute-0 nova_compute[186241]: 2025-11-25 06:22:33.081 186245 DEBUG oslo_concurrency.lockutils [req-cd66bf94-9e0c-430b-8b97-334c20adf843 req-644fe9e7-de02-41a9-870c-435c111426d7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-394ce10b-bae7-43fa-b133-df28182f99db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:22:33 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:33.143 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:22:33 compute-0 nova_compute[186241]: 2025-11-25 06:22:33.198 186245 INFO nova.compute.manager [-] [instance: 394ce10b-bae7-43fa-b133-df28182f99db] Took 2.45 seconds to deallocate network for instance.
Nov 25 06:22:33 compute-0 nova_compute[186241]: 2025-11-25 06:22:33.704 186245 DEBUG oslo_concurrency.lockutils [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:22:33 compute-0 nova_compute[186241]: 2025-11-25 06:22:33.704 186245 DEBUG oslo_concurrency.lockutils [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:22:33 compute-0 nova_compute[186241]: 2025-11-25 06:22:33.743 186245 DEBUG nova.compute.provider_tree [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:22:34 compute-0 podman[213547]: 2025-11-25 06:22:34.08999109 +0000 UTC m=+0.068618337 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Nov 25 06:22:34 compute-0 nova_compute[186241]: 2025-11-25 06:22:34.246 186245 DEBUG nova.scheduler.client.report [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:22:34 compute-0 nova_compute[186241]: 2025-11-25 06:22:34.751 186245 DEBUG oslo_concurrency.lockutils [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:22:34 compute-0 nova_compute[186241]: 2025-11-25 06:22:34.767 186245 INFO nova.scheduler.client.report [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance 394ce10b-bae7-43fa-b133-df28182f99db
Nov 25 06:22:35 compute-0 nova_compute[186241]: 2025-11-25 06:22:35.240 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:35 compute-0 nova_compute[186241]: 2025-11-25 06:22:35.775 186245 DEBUG oslo_concurrency.lockutils [None req-ab751794-c63c-4063-957d-97579b2345e7 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "394ce10b-bae7-43fa-b133-df28182f99db" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.797s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:22:36 compute-0 nova_compute[186241]: 2025-11-25 06:22:36.929 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:40 compute-0 podman[213565]: 2025-11-25 06:22:40.063892467 +0000 UTC m=+0.043089502 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Nov 25 06:22:40 compute-0 nova_compute[186241]: 2025-11-25 06:22:40.241 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:40 compute-0 nova_compute[186241]: 2025-11-25 06:22:40.272 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:40 compute-0 nova_compute[186241]: 2025-11-25 06:22:40.348 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:41 compute-0 nova_compute[186241]: 2025-11-25 06:22:41.929 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:45 compute-0 podman[213583]: 2025-11-25 06:22:45.05708372 +0000 UTC m=+0.036013918 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:22:45 compute-0 nova_compute[186241]: 2025-11-25 06:22:45.243 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:46.358 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:19:5e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f088e83e-6869-485f-aff5-47d816c267b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5f4b8f2-91df-4e6d-a0cc-17ea8a984247, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e4302f31-94aa-43d3-9e59-7579149e9537) old=Port_Binding(mac=['fa:16:3e:27:19:5e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f088e83e-6869-485f-aff5-47d816c267b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:22:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:46.359 103953 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e4302f31-94aa-43d3-9e59-7579149e9537 in datapath f088e83e-6869-485f-aff5-47d816c267b4 updated
Nov 25 06:22:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:46.360 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f088e83e-6869-485f-aff5-47d816c267b4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:22:46 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:46.361 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[591c6b44-b2a5-48bc-b5c1-6bafa0529c23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:22:46 compute-0 nova_compute[186241]: 2025-11-25 06:22:46.930 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:47.415 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:22:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:47.416 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:22:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:22:47.416 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:22:50 compute-0 nova_compute[186241]: 2025-11-25 06:22:50.244 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:51 compute-0 nova_compute[186241]: 2025-11-25 06:22:51.932 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:53 compute-0 podman[213605]: 2025-11-25 06:22:53.084109851 +0000 UTC m=+0.058811023 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:22:55 compute-0 nova_compute[186241]: 2025-11-25 06:22:55.246 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:56 compute-0 nova_compute[186241]: 2025-11-25 06:22:56.932 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:22:57 compute-0 podman[213629]: 2025-11-25 06:22:57.061863936 +0000 UTC m=+0.040066325 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:22:57 compute-0 podman[213628]: 2025-11-25 06:22:57.065920237 +0000 UTC m=+0.044984113 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 06:23:00 compute-0 podman[213666]: 2025-11-25 06:23:00.05813572 +0000 UTC m=+0.033522623 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 25 06:23:00 compute-0 nova_compute[186241]: 2025-11-25 06:23:00.248 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:01 compute-0 nova_compute[186241]: 2025-11-25 06:23:01.638 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "b79ea1d7-d6e1-430b-82bf-566447f159f3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:01 compute-0 nova_compute[186241]: 2025-11-25 06:23:01.638 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:01 compute-0 nova_compute[186241]: 2025-11-25 06:23:01.934 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:02 compute-0 nova_compute[186241]: 2025-11-25 06:23:02.141 186245 DEBUG nova.compute.manager [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:23:02 compute-0 nova_compute[186241]: 2025-11-25 06:23:02.670 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:02 compute-0 nova_compute[186241]: 2025-11-25 06:23:02.671 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:02 compute-0 nova_compute[186241]: 2025-11-25 06:23:02.676 186245 DEBUG nova.virt.hardware [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:23:02 compute-0 nova_compute[186241]: 2025-11-25 06:23:02.676 186245 INFO nova.compute.claims [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:23:03 compute-0 nova_compute[186241]: 2025-11-25 06:23:03.713 186245 DEBUG nova.compute.provider_tree [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:23:04 compute-0 nova_compute[186241]: 2025-11-25 06:23:04.217 186245 DEBUG nova.scheduler.client.report [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:23:04 compute-0 nova_compute[186241]: 2025-11-25 06:23:04.722 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:04 compute-0 nova_compute[186241]: 2025-11-25 06:23:04.722 186245 DEBUG nova.compute.manager [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:23:05 compute-0 podman[213682]: 2025-11-25 06:23:05.066991795 +0000 UTC m=+0.045101375 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 06:23:05 compute-0 nova_compute[186241]: 2025-11-25 06:23:05.228 186245 DEBUG nova.compute.manager [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:23:05 compute-0 nova_compute[186241]: 2025-11-25 06:23:05.229 186245 DEBUG nova.network.neutron [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:23:05 compute-0 nova_compute[186241]: 2025-11-25 06:23:05.249 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:05 compute-0 nova_compute[186241]: 2025-11-25 06:23:05.561 186245 DEBUG nova.policy [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:23:05 compute-0 nova_compute[186241]: 2025-11-25 06:23:05.732 186245 INFO nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:23:06 compute-0 nova_compute[186241]: 2025-11-25 06:23:06.236 186245 DEBUG nova.compute.manager [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:23:06 compute-0 nova_compute[186241]: 2025-11-25 06:23:06.793 186245 DEBUG nova.network.neutron [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Successfully created port: 17e014dc-1fe1-4091-95b3-3c08eb9abbb2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:23:06 compute-0 nova_compute[186241]: 2025-11-25 06:23:06.935 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.247 186245 DEBUG nova.compute.manager [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.248 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.248 186245 INFO nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Creating image(s)
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.249 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.249 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.250 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.250 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.253 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.254 186245 DEBUG oslo_concurrency.processutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.298 186245 DEBUG oslo_concurrency.processutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.299 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.300 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.300 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.303 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.303 186245 DEBUG oslo_concurrency.processutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.347 186245 DEBUG oslo_concurrency.processutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.348 186245 DEBUG oslo_concurrency.processutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.367 186245 DEBUG oslo_concurrency.processutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.368 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.368 186245 DEBUG oslo_concurrency.processutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.411 186245 DEBUG oslo_concurrency.processutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.411 186245 DEBUG nova.virt.disk.api [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.412 186245 DEBUG oslo_concurrency.processutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.456 186245 DEBUG oslo_concurrency.processutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.457 186245 DEBUG nova.virt.disk.api [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.457 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.458 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Ensure instance console log exists: /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.458 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.458 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.459 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.566 186245 DEBUG nova.network.neutron [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Successfully updated port: 17e014dc-1fe1-4091-95b3-3c08eb9abbb2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.745 186245 DEBUG nova.compute.manager [req-a723a46c-e7c2-412a-992b-7948ae09de10 req-45a834df-e872-480b-a7aa-7b8780c34e47 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Received event network-changed-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.745 186245 DEBUG nova.compute.manager [req-a723a46c-e7c2-412a-992b-7948ae09de10 req-45a834df-e872-480b-a7aa-7b8780c34e47 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Refreshing instance network info cache due to event network-changed-17e014dc-1fe1-4091-95b3-3c08eb9abbb2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.745 186245 DEBUG oslo_concurrency.lockutils [req-a723a46c-e7c2-412a-992b-7948ae09de10 req-45a834df-e872-480b-a7aa-7b8780c34e47 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-b79ea1d7-d6e1-430b-82bf-566447f159f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.745 186245 DEBUG oslo_concurrency.lockutils [req-a723a46c-e7c2-412a-992b-7948ae09de10 req-45a834df-e872-480b-a7aa-7b8780c34e47 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-b79ea1d7-d6e1-430b-82bf-566447f159f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:23:07 compute-0 nova_compute[186241]: 2025-11-25 06:23:07.746 186245 DEBUG nova.network.neutron [req-a723a46c-e7c2-412a-992b-7948ae09de10 req-45a834df-e872-480b-a7aa-7b8780c34e47 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Refreshing network info cache for port 17e014dc-1fe1-4091-95b3-3c08eb9abbb2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:23:08 compute-0 nova_compute[186241]: 2025-11-25 06:23:08.069 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-b79ea1d7-d6e1-430b-82bf-566447f159f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:23:09 compute-0 nova_compute[186241]: 2025-11-25 06:23:09.205 186245 DEBUG nova.network.neutron [req-a723a46c-e7c2-412a-992b-7948ae09de10 req-45a834df-e872-480b-a7aa-7b8780c34e47 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:23:10 compute-0 nova_compute[186241]: 2025-11-25 06:23:10.250 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:10 compute-0 nova_compute[186241]: 2025-11-25 06:23:10.395 186245 DEBUG nova.network.neutron [req-a723a46c-e7c2-412a-992b-7948ae09de10 req-45a834df-e872-480b-a7aa-7b8780c34e47 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:23:10 compute-0 nova_compute[186241]: 2025-11-25 06:23:10.898 186245 DEBUG oslo_concurrency.lockutils [req-a723a46c-e7c2-412a-992b-7948ae09de10 req-45a834df-e872-480b-a7aa-7b8780c34e47 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-b79ea1d7-d6e1-430b-82bf-566447f159f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:23:10 compute-0 nova_compute[186241]: 2025-11-25 06:23:10.898 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-b79ea1d7-d6e1-430b-82bf-566447f159f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:23:10 compute-0 nova_compute[186241]: 2025-11-25 06:23:10.899 186245 DEBUG nova.network.neutron [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:23:11 compute-0 podman[213715]: 2025-11-25 06:23:11.063103445 +0000 UTC m=+0.042606282 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 06:23:11 compute-0 nova_compute[186241]: 2025-11-25 06:23:11.676 186245 DEBUG nova.network.neutron [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:23:11 compute-0 nova_compute[186241]: 2025-11-25 06:23:11.935 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:13 compute-0 nova_compute[186241]: 2025-11-25 06:23:13.684 186245 DEBUG nova.network.neutron [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Updating instance_info_cache with network_info: [{"id": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "address": "fa:16:3e:44:4d:aa", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17e014dc-1f", "ovs_interfaceid": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.187 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-b79ea1d7-d6e1-430b-82bf-566447f159f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.188 186245 DEBUG nova.compute.manager [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Instance network_info: |[{"id": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "address": "fa:16:3e:44:4d:aa", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17e014dc-1f", "ovs_interfaceid": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.190 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Start _get_guest_xml network_info=[{"id": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "address": "fa:16:3e:44:4d:aa", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17e014dc-1f", "ovs_interfaceid": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.193 186245 WARNING nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.193 186245 DEBUG nova.virt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-112305377', uuid='b79ea1d7-d6e1-430b-82bf-566447f159f3'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_ma
pping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "address": "fa:16:3e:44:4d:aa", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17e014dc-1f", "ovs_interfaceid": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764051794.1938295) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.199 186245 DEBUG nova.virt.libvirt.host [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.200 186245 DEBUG nova.virt.libvirt.host [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.202 186245 DEBUG nova.virt.libvirt.host [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.202 186245 DEBUG nova.virt.libvirt.host [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.203 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.203 186245 DEBUG nova.virt.hardware [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.203 186245 DEBUG nova.virt.hardware [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.203 186245 DEBUG nova.virt.hardware [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.204 186245 DEBUG nova.virt.hardware [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.204 186245 DEBUG nova.virt.hardware [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.204 186245 DEBUG nova.virt.hardware [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.204 186245 DEBUG nova.virt.hardware [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.204 186245 DEBUG nova.virt.hardware [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.205 186245 DEBUG nova.virt.hardware [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.205 186245 DEBUG nova.virt.hardware [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.205 186245 DEBUG nova.virt.hardware [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.207 186245 DEBUG nova.virt.libvirt.vif [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:23:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-112305377',display_name='tempest-TestNetworkBasicOps-server-112305377',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-112305377',id=4,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOcCCM5XX449ERaNbK92qLPvVLH1Xsp1m2F1vTT92DeaMDd7WOtWX4CV3c9DgYE1GaAP6//Jn1dzZvGo29HLczF+oNP7IRiMbkWTtn2RSSpZ1JyMvXiH3LfhFpiCqACiqw==',key_name='tempest-TestNetworkBasicOps-2120882882',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-zao59wro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:23:06Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=b79ea1d7-d6e1-430b-82bf-566447f159f3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "address": "fa:16:3e:44:4d:aa", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17e014dc-1f", "ovs_interfaceid": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.208 186245 DEBUG nova.network.os_vif_util [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "address": "fa:16:3e:44:4d:aa", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17e014dc-1f", "ovs_interfaceid": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.208 186245 DEBUG nova.network.os_vif_util [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:4d:aa,bridge_name='br-int',has_traffic_filtering=True,id=17e014dc-1fe1-4091-95b3-3c08eb9abbb2,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17e014dc-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.209 186245 DEBUG nova.objects.instance [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid b79ea1d7-d6e1-430b-82bf-566447f159f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.713 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:23:14 compute-0 nova_compute[186241]:   <uuid>b79ea1d7-d6e1-430b-82bf-566447f159f3</uuid>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   <name>instance-00000004</name>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-112305377</nova:name>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:23:14</nova:creationTime>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:23:14 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:23:14 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:23:14 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:23:14 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:23:14 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:23:14 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:23:14 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:23:14 compute-0 nova_compute[186241]:         <nova:port uuid="17e014dc-1fe1-4091-95b3-3c08eb9abbb2">
Nov 25 06:23:14 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <system>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <entry name="serial">b79ea1d7-d6e1-430b-82bf-566447f159f3</entry>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <entry name="uuid">b79ea1d7-d6e1-430b-82bf-566447f159f3</entry>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     </system>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   <os>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   </os>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   <features>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   </features>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.config"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:44:4d:aa"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <target dev="tap17e014dc-1f"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/console.log" append="off"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <video>
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     </video>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:23:14 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:23:14 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:23:14 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:23:14 compute-0 nova_compute[186241]: </domain>
Nov 25 06:23:14 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.714 186245 DEBUG nova.compute.manager [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Preparing to wait for external event network-vif-plugged-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.714 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.714 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.714 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.715 186245 DEBUG nova.virt.libvirt.vif [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:23:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-112305377',display_name='tempest-TestNetworkBasicOps-server-112305377',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-112305377',id=4,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOcCCM5XX449ERaNbK92qLPvVLH1Xsp1m2F1vTT92DeaMDd7WOtWX4CV3c9DgYE1GaAP6//Jn1dzZvGo29HLczF+oNP7IRiMbkWTtn2RSSpZ1JyMvXiH3LfhFpiCqACiqw==',key_name='tempest-TestNetworkBasicOps-2120882882',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-zao59wro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:23:06Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=b79ea1d7-d6e1-430b-82bf-566447f159f3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "address": "fa:16:3e:44:4d:aa", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17e014dc-1f", "ovs_interfaceid": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.715 186245 DEBUG nova.network.os_vif_util [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "address": "fa:16:3e:44:4d:aa", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17e014dc-1f", "ovs_interfaceid": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.716 186245 DEBUG nova.network.os_vif_util [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:4d:aa,bridge_name='br-int',has_traffic_filtering=True,id=17e014dc-1fe1-4091-95b3-3c08eb9abbb2,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17e014dc-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.716 186245 DEBUG os_vif [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:4d:aa,bridge_name='br-int',has_traffic_filtering=True,id=17e014dc-1fe1-4091-95b3-3c08eb9abbb2,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17e014dc-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.716 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.717 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.717 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.717 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.718 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c3a12c4f-2938-5806-967f-0ca164e6c841', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.721 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.723 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.723 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17e014dc-1f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.723 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap17e014dc-1f, col_values=(('qos', UUID('085dad69-ccf0-424b-bc4e-53f583c9421d')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.724 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap17e014dc-1f, col_values=(('external_ids', {'iface-id': '17e014dc-1fe1-4091-95b3-3c08eb9abbb2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:4d:aa', 'vm-uuid': 'b79ea1d7-d6e1-430b-82bf-566447f159f3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:23:14 compute-0 NetworkManager[55345]: <info>  [1764051794.7256] manager: (tap17e014dc-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.725 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.726 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.728 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:14 compute-0 nova_compute[186241]: 2025-11-25 06:23:14.728 186245 INFO os_vif [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:4d:aa,bridge_name='br-int',has_traffic_filtering=True,id=17e014dc-1fe1-4091-95b3-3c08eb9abbb2,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17e014dc-1f')
Nov 25 06:23:16 compute-0 podman[213735]: 2025-11-25 06:23:16.058907436 +0000 UTC m=+0.036960539 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 25 06:23:16 compute-0 nova_compute[186241]: 2025-11-25 06:23:16.252 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:23:16 compute-0 nova_compute[186241]: 2025-11-25 06:23:16.253 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:23:16 compute-0 nova_compute[186241]: 2025-11-25 06:23:16.253 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:44:4d:aa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:23:16 compute-0 nova_compute[186241]: 2025-11-25 06:23:16.253 186245 INFO nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Using config drive
Nov 25 06:23:16 compute-0 nova_compute[186241]: 2025-11-25 06:23:16.937 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:17 compute-0 ovn_controller[95135]: 2025-11-25T06:23:17Z|00078|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.444 186245 INFO nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Creating config drive at /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.config
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.449 186245 DEBUG oslo_concurrency.processutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpvnmmbtkz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.566 186245 DEBUG oslo_concurrency.processutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpvnmmbtkz" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:17 compute-0 kernel: tap17e014dc-1f: entered promiscuous mode
Nov 25 06:23:17 compute-0 NetworkManager[55345]: <info>  [1764051797.6031] manager: (tap17e014dc-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Nov 25 06:23:17 compute-0 ovn_controller[95135]: 2025-11-25T06:23:17Z|00079|binding|INFO|Claiming lport 17e014dc-1fe1-4091-95b3-3c08eb9abbb2 for this chassis.
Nov 25 06:23:17 compute-0 ovn_controller[95135]: 2025-11-25T06:23:17Z|00080|binding|INFO|17e014dc-1fe1-4091-95b3-3c08eb9abbb2: Claiming fa:16:3e:44:4d:aa 10.100.0.3
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.605 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.611 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.616 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:4d:aa 10.100.0.3'], port_security=['fa:16:3e:44:4d:aa 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b79ea1d7-d6e1-430b-82bf-566447f159f3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f088e83e-6869-485f-aff5-47d816c267b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '90a196c4-6984-431f-afc3-bb9d2e72304f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5f4b8f2-91df-4e6d-a0cc-17ea8a984247, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=17e014dc-1fe1-4091-95b3-3c08eb9abbb2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.617 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 17e014dc-1fe1-4091-95b3-3c08eb9abbb2 in datapath f088e83e-6869-485f-aff5-47d816c267b4 bound to our chassis
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.618 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f088e83e-6869-485f-aff5-47d816c267b4
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.626 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7fa99f2d-25f5-4ab7-87ee-6103a7cbdb3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.627 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf088e83e-61 in ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.630 211354 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf088e83e-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.630 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[16e02be1-c7e1-4090-be1a-a787dbf727cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.630 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a9d859-96b1-4cd9-9e37-f51975add4e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 systemd-machined[152921]: New machine qemu-4-instance-00000004.
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.637 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[34c89c87-18d1-47b1-aa38-72d1d25b862f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 systemd-udevd[213778]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:23:17 compute-0 NetworkManager[55345]: <info>  [1764051797.6483] device (tap17e014dc-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:23:17 compute-0 NetworkManager[55345]: <info>  [1764051797.6492] device (tap17e014dc-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:23:17 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.666 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8719aaed-3611-4b65-a474-78ba2eeb4d75]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_controller[95135]: 2025-11-25T06:23:17Z|00081|binding|INFO|Setting lport 17e014dc-1fe1-4091-95b3-3c08eb9abbb2 ovn-installed in OVS
Nov 25 06:23:17 compute-0 ovn_controller[95135]: 2025-11-25T06:23:17Z|00082|binding|INFO|Setting lport 17e014dc-1fe1-4091-95b3-3c08eb9abbb2 up in Southbound
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.671 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.685 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[0dc59be7-b4ab-430f-a01d-5a1d83960444]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 NetworkManager[55345]: <info>  [1764051797.6893] manager: (tapf088e83e-60): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.689 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[66f4dddb-c244-444f-9271-b1f61c3bda6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.710 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[bc8dbc29-9e60-45c1-b5b5-a1420b61f890]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.712 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[baf87aaf-4237-408e-aa04-d61d5bf0e335]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 NetworkManager[55345]: <info>  [1764051797.7269] device (tapf088e83e-60): carrier: link connected
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.730 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[6cef0c0a-ced2-4499-a3c1-ae27a896d361]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.742 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[d0b44c6d-ab40-4a76-bcec-2367692bdcd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf088e83e-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:19:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 278613, 'reachable_time': 37864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213801, 'error': None, 'target': 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.754 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef1df7b-9965-48af-b174-b535cf79d144]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:195e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 278613, 'tstamp': 278613}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213802, 'error': None, 'target': 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.765 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[b52ac1a7-d84e-4418-b04a-4880c4fa5b57]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf088e83e-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:19:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 278613, 'reachable_time': 37864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213803, 'error': None, 'target': 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.786 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb5264a-91c5-4e1f-8fd6-ead88f9ad371]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.824 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[c3627eb9-c8d1-47d9-9110-8bba33357612]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.825 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf088e83e-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.825 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.826 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf088e83e-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:23:17 compute-0 NetworkManager[55345]: <info>  [1764051797.8277] manager: (tapf088e83e-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 25 06:23:17 compute-0 kernel: tapf088e83e-60: entered promiscuous mode
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.827 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.830 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf088e83e-60, col_values=(('external_ids', {'iface-id': 'e4302f31-94aa-43d3-9e59-7579149e9537'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.831 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:17 compute-0 ovn_controller[95135]: 2025-11-25T06:23:17Z|00083|binding|INFO|Releasing lport e4302f31-94aa-43d3-9e59-7579149e9537 from this chassis (sb_readonly=0)
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.843 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.844 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.845 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ebba1913-1cff-4e66-a796-629c245faa42]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.845 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.845 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.845 103953 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f088e83e-6869-485f-aff5-47d816c267b4 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.845 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.846 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[773411b6-1bd8-494f-ac9b-819a58d38375]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.846 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.846 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[a752bd0a-d30f-4388-8f98-0e3376b6f815]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.847 103953 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: global
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     log         /dev/log local0 debug
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     log-tag     haproxy-metadata-proxy-f088e83e-6869-485f-aff5-47d816c267b4
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     user        root
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     group       root
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     maxconn     1024
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     pidfile     /var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     daemon
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: defaults
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     log global
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     mode http
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     option httplog
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     option dontlognull
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     option http-server-close
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     option forwardfor
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     retries                 3
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     timeout http-request    30s
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     timeout connect         30s
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     timeout client          32s
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     timeout server          32s
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     timeout http-keep-alive 30s
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: listen listener
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     bind 169.254.169.254:80
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:     http-request add-header X-OVN-Network-ID f088e83e-6869-485f-aff5-47d816c267b4
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 06:23:17 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:17.847 103953 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'env', 'PROCESS_TAG=haproxy-f088e83e-6869-485f-aff5-47d816c267b4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f088e83e-6869-485f-aff5-47d816c267b4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.874 186245 DEBUG nova.compute.manager [req-d6531150-6e42-4917-94d8-22865eeaf66e req-4b5e3a20-ca60-4e37-826f-65e6ce1a557f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Received event network-vif-plugged-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.875 186245 DEBUG oslo_concurrency.lockutils [req-d6531150-6e42-4917-94d8-22865eeaf66e req-4b5e3a20-ca60-4e37-826f-65e6ce1a557f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.875 186245 DEBUG oslo_concurrency.lockutils [req-d6531150-6e42-4917-94d8-22865eeaf66e req-4b5e3a20-ca60-4e37-826f-65e6ce1a557f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.875 186245 DEBUG oslo_concurrency.lockutils [req-d6531150-6e42-4917-94d8-22865eeaf66e req-4b5e3a20-ca60-4e37-826f-65e6ce1a557f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.875 186245 DEBUG nova.compute.manager [req-d6531150-6e42-4917-94d8-22865eeaf66e req-4b5e3a20-ca60-4e37-826f-65e6ce1a557f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Processing event network-vif-plugged-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.919 186245 DEBUG nova.compute.manager [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.921 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.928 186245 INFO nova.virt.libvirt.driver [-] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Instance spawned successfully.
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.928 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:23:17 compute-0 nova_compute[186241]: 2025-11-25 06:23:17.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:23:18 compute-0 podman[213839]: 2025-11-25 06:23:18.131244036 +0000 UTC m=+0.029688241 container create 623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 06:23:18 compute-0 systemd[1]: Started libpod-conmon-623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8.scope.
Nov 25 06:23:18 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:23:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16c10de93db04b69bc31468fbdf4d084f6cea2f537dfc210e64f6c334d764c62/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:23:18 compute-0 podman[213839]: 2025-11-25 06:23:18.185114715 +0000 UTC m=+0.083558921 container init 623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:23:18 compute-0 podman[213839]: 2025-11-25 06:23:18.190144516 +0000 UTC m=+0.088588722 container start 623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:23:18 compute-0 podman[213839]: 2025-11-25 06:23:18.117431283 +0000 UTC m=+0.015875509 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:23:18 compute-0 neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4[213852]: [NOTICE]   (213856) : New worker (213858) forked
Nov 25 06:23:18 compute-0 neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4[213852]: [NOTICE]   (213856) : Loading success.
Nov 25 06:23:18 compute-0 nova_compute[186241]: 2025-11-25 06:23:18.437 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:23:18 compute-0 nova_compute[186241]: 2025-11-25 06:23:18.437 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:23:18 compute-0 nova_compute[186241]: 2025-11-25 06:23:18.438 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:23:18 compute-0 nova_compute[186241]: 2025-11-25 06:23:18.438 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:23:18 compute-0 nova_compute[186241]: 2025-11-25 06:23:18.438 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:23:18 compute-0 nova_compute[186241]: 2025-11-25 06:23:18.439 186245 DEBUG nova.virt.libvirt.driver [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:23:18 compute-0 nova_compute[186241]: 2025-11-25 06:23:18.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:23:18 compute-0 nova_compute[186241]: 2025-11-25 06:23:18.944 186245 INFO nova.compute.manager [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Took 11.70 seconds to spawn the instance on the hypervisor.
Nov 25 06:23:18 compute-0 nova_compute[186241]: 2025-11-25 06:23:18.944 186245 DEBUG nova.compute.manager [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:23:19 compute-0 nova_compute[186241]: 2025-11-25 06:23:19.456 186245 INFO nova.compute.manager [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Took 16.81 seconds to build instance.
Nov 25 06:23:19 compute-0 nova_compute[186241]: 2025-11-25 06:23:19.726 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:19 compute-0 nova_compute[186241]: 2025-11-25 06:23:19.958 186245 DEBUG oslo_concurrency.lockutils [None req-3496b0ab-9470-41a9-9a45-565eb50dfb17 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:20 compute-0 nova_compute[186241]: 2025-11-25 06:23:20.038 186245 DEBUG nova.compute.manager [req-aa23efd3-689d-46df-9ec7-013893bb1f1a req-6490a1c9-8414-407b-b698-c3114de21390 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Received event network-vif-plugged-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:23:20 compute-0 nova_compute[186241]: 2025-11-25 06:23:20.038 186245 DEBUG oslo_concurrency.lockutils [req-aa23efd3-689d-46df-9ec7-013893bb1f1a req-6490a1c9-8414-407b-b698-c3114de21390 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:20 compute-0 nova_compute[186241]: 2025-11-25 06:23:20.038 186245 DEBUG oslo_concurrency.lockutils [req-aa23efd3-689d-46df-9ec7-013893bb1f1a req-6490a1c9-8414-407b-b698-c3114de21390 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:20 compute-0 nova_compute[186241]: 2025-11-25 06:23:20.039 186245 DEBUG oslo_concurrency.lockutils [req-aa23efd3-689d-46df-9ec7-013893bb1f1a req-6490a1c9-8414-407b-b698-c3114de21390 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:20 compute-0 nova_compute[186241]: 2025-11-25 06:23:20.039 186245 DEBUG nova.compute.manager [req-aa23efd3-689d-46df-9ec7-013893bb1f1a req-6490a1c9-8414-407b-b698-c3114de21390 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] No waiting events found dispatching network-vif-plugged-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:23:20 compute-0 nova_compute[186241]: 2025-11-25 06:23:20.039 186245 WARNING nova.compute.manager [req-aa23efd3-689d-46df-9ec7-013893bb1f1a req-6490a1c9-8414-407b-b698-c3114de21390 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Received unexpected event network-vif-plugged-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 for instance with vm_state active and task_state None.
Nov 25 06:23:21 compute-0 nova_compute[186241]: 2025-11-25 06:23:21.928 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:23:21 compute-0 nova_compute[186241]: 2025-11-25 06:23:21.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:23:21 compute-0 nova_compute[186241]: 2025-11-25 06:23:21.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:23:21 compute-0 nova_compute[186241]: 2025-11-25 06:23:21.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:23:21 compute-0 nova_compute[186241]: 2025-11-25 06:23:21.939 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:22 compute-0 nova_compute[186241]: 2025-11-25 06:23:22.439 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:22 compute-0 nova_compute[186241]: 2025-11-25 06:23:22.439 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:22 compute-0 nova_compute[186241]: 2025-11-25 06:23:22.439 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:22 compute-0 nova_compute[186241]: 2025-11-25 06:23:22.439 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:23:23 compute-0 ovn_controller[95135]: 2025-11-25T06:23:23Z|00084|binding|INFO|Releasing lport e4302f31-94aa-43d3-9e59-7579149e9537 from this chassis (sb_readonly=0)
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.431 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:23 compute-0 NetworkManager[55345]: <info>  [1764051803.4339] manager: (patch-br-int-to-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Nov 25 06:23:23 compute-0 NetworkManager[55345]: <info>  [1764051803.4345] manager: (patch-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Nov 25 06:23:23 compute-0 ovn_controller[95135]: 2025-11-25T06:23:23Z|00085|binding|INFO|Releasing lport e4302f31-94aa-43d3-9e59-7579149e9537 from this chassis (sb_readonly=0)
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.464 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.465 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.521 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.522 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.577 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.769 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.770 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5636MB free_disk=73.02103805541992GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.770 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.771 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.860 186245 DEBUG nova.compute.manager [req-542c60ea-97b8-4b09-9136-7c7296eb3e0a req-29721f41-3814-4ef3-b18e-4582aec32885 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Received event network-changed-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.860 186245 DEBUG nova.compute.manager [req-542c60ea-97b8-4b09-9136-7c7296eb3e0a req-29721f41-3814-4ef3-b18e-4582aec32885 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Refreshing instance network info cache due to event network-changed-17e014dc-1fe1-4091-95b3-3c08eb9abbb2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.860 186245 DEBUG oslo_concurrency.lockutils [req-542c60ea-97b8-4b09-9136-7c7296eb3e0a req-29721f41-3814-4ef3-b18e-4582aec32885 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-b79ea1d7-d6e1-430b-82bf-566447f159f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.861 186245 DEBUG oslo_concurrency.lockutils [req-542c60ea-97b8-4b09-9136-7c7296eb3e0a req-29721f41-3814-4ef3-b18e-4582aec32885 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-b79ea1d7-d6e1-430b-82bf-566447f159f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:23:23 compute-0 nova_compute[186241]: 2025-11-25 06:23:23.861 186245 DEBUG nova.network.neutron [req-542c60ea-97b8-4b09-9136-7c7296eb3e0a req-29721f41-3814-4ef3-b18e-4582aec32885 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Refreshing network info cache for port 17e014dc-1fe1-4091-95b3-3c08eb9abbb2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:23:24 compute-0 podman[213871]: 2025-11-25 06:23:24.080239566 +0000 UTC m=+0.058686778 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 25 06:23:24 compute-0 nova_compute[186241]: 2025-11-25 06:23:24.727 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:24 compute-0 nova_compute[186241]: 2025-11-25 06:23:24.804 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance b79ea1d7-d6e1-430b-82bf-566447f159f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:23:24 compute-0 nova_compute[186241]: 2025-11-25 06:23:24.805 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:23:24 compute-0 nova_compute[186241]: 2025-11-25 06:23:24.805 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:23:24 compute-0 nova_compute[186241]: 2025-11-25 06:23:24.835 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:23:25 compute-0 nova_compute[186241]: 2025-11-25 06:23:25.339 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:23:25 compute-0 nova_compute[186241]: 2025-11-25 06:23:25.845 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:23:25 compute-0 nova_compute[186241]: 2025-11-25 06:23:25.846 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:26 compute-0 nova_compute[186241]: 2025-11-25 06:23:26.846 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:23:26 compute-0 nova_compute[186241]: 2025-11-25 06:23:26.940 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:27 compute-0 nova_compute[186241]: 2025-11-25 06:23:27.351 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:23:27 compute-0 nova_compute[186241]: 2025-11-25 06:23:27.352 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:23:27 compute-0 nova_compute[186241]: 2025-11-25 06:23:27.352 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:23:27 compute-0 nova_compute[186241]: 2025-11-25 06:23:27.519 186245 DEBUG nova.network.neutron [req-542c60ea-97b8-4b09-9136-7c7296eb3e0a req-29721f41-3814-4ef3-b18e-4582aec32885 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Updated VIF entry in instance network info cache for port 17e014dc-1fe1-4091-95b3-3c08eb9abbb2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:23:27 compute-0 nova_compute[186241]: 2025-11-25 06:23:27.519 186245 DEBUG nova.network.neutron [req-542c60ea-97b8-4b09-9136-7c7296eb3e0a req-29721f41-3814-4ef3-b18e-4582aec32885 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Updating instance_info_cache with network_info: [{"id": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "address": "fa:16:3e:44:4d:aa", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17e014dc-1f", "ovs_interfaceid": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:23:28 compute-0 nova_compute[186241]: 2025-11-25 06:23:28.023 186245 DEBUG oslo_concurrency.lockutils [req-542c60ea-97b8-4b09-9136-7c7296eb3e0a req-29721f41-3814-4ef3-b18e-4582aec32885 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-b79ea1d7-d6e1-430b-82bf-566447f159f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:23:28 compute-0 podman[213904]: 2025-11-25 06:23:28.070993949 +0000 UTC m=+0.044281187 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:23:28 compute-0 podman[213903]: 2025-11-25 06:23:28.0731226 +0000 UTC m=+0.048450252 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 06:23:28 compute-0 ovn_controller[95135]: 2025-11-25T06:23:28Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:44:4d:aa 10.100.0.3
Nov 25 06:23:28 compute-0 ovn_controller[95135]: 2025-11-25T06:23:28Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:44:4d:aa 10.100.0.3
Nov 25 06:23:29 compute-0 nova_compute[186241]: 2025-11-25 06:23:29.731 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:31 compute-0 podman[213945]: 2025-11-25 06:23:31.086986817 +0000 UTC m=+0.066388657 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 25 06:23:31 compute-0 nova_compute[186241]: 2025-11-25 06:23:31.944 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:34 compute-0 nova_compute[186241]: 2025-11-25 06:23:34.181 186245 INFO nova.compute.manager [None req-c1b699c0-6cbc-4508-821d-7171cab5b4a3 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Get console output
Nov 25 06:23:34 compute-0 nova_compute[186241]: 2025-11-25 06:23:34.184 211770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 25 06:23:34 compute-0 nova_compute[186241]: 2025-11-25 06:23:34.734 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:36 compute-0 podman[213962]: 2025-11-25 06:23:36.061951082 +0000 UTC m=+0.040455470 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Nov 25 06:23:36 compute-0 nova_compute[186241]: 2025-11-25 06:23:36.945 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:38 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:38.175 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:23:38 compute-0 nova_compute[186241]: 2025-11-25 06:23:38.176 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:38 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:38.177 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:23:38 compute-0 nova_compute[186241]: 2025-11-25 06:23:38.347 186245 DEBUG nova.compute.manager [req-123e0fd9-df00-41b8-b7e5-89e8c4351aaf req-3a6d9744-46f7-4b8e-b3f5-4268563d005d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Received event network-changed-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:23:38 compute-0 nova_compute[186241]: 2025-11-25 06:23:38.347 186245 DEBUG nova.compute.manager [req-123e0fd9-df00-41b8-b7e5-89e8c4351aaf req-3a6d9744-46f7-4b8e-b3f5-4268563d005d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Refreshing instance network info cache due to event network-changed-17e014dc-1fe1-4091-95b3-3c08eb9abbb2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:23:38 compute-0 nova_compute[186241]: 2025-11-25 06:23:38.348 186245 DEBUG oslo_concurrency.lockutils [req-123e0fd9-df00-41b8-b7e5-89e8c4351aaf req-3a6d9744-46f7-4b8e-b3f5-4268563d005d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-b79ea1d7-d6e1-430b-82bf-566447f159f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:23:38 compute-0 nova_compute[186241]: 2025-11-25 06:23:38.348 186245 DEBUG oslo_concurrency.lockutils [req-123e0fd9-df00-41b8-b7e5-89e8c4351aaf req-3a6d9744-46f7-4b8e-b3f5-4268563d005d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-b79ea1d7-d6e1-430b-82bf-566447f159f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:23:38 compute-0 nova_compute[186241]: 2025-11-25 06:23:38.348 186245 DEBUG nova.network.neutron [req-123e0fd9-df00-41b8-b7e5-89e8c4351aaf req-3a6d9744-46f7-4b8e-b3f5-4268563d005d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Refreshing network info cache for port 17e014dc-1fe1-4091-95b3-3c08eb9abbb2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:23:39 compute-0 nova_compute[186241]: 2025-11-25 06:23:39.735 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:41 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:41.178 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:23:41 compute-0 nova_compute[186241]: 2025-11-25 06:23:41.947 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:42 compute-0 podman[213982]: 2025-11-25 06:23:42.061852349 +0000 UTC m=+0.039955616 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:23:42 compute-0 nova_compute[186241]: 2025-11-25 06:23:42.475 186245 DEBUG nova.network.neutron [req-123e0fd9-df00-41b8-b7e5-89e8c4351aaf req-3a6d9744-46f7-4b8e-b3f5-4268563d005d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Updated VIF entry in instance network info cache for port 17e014dc-1fe1-4091-95b3-3c08eb9abbb2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:23:42 compute-0 nova_compute[186241]: 2025-11-25 06:23:42.475 186245 DEBUG nova.network.neutron [req-123e0fd9-df00-41b8-b7e5-89e8c4351aaf req-3a6d9744-46f7-4b8e-b3f5-4268563d005d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Updating instance_info_cache with network_info: [{"id": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "address": "fa:16:3e:44:4d:aa", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17e014dc-1f", "ovs_interfaceid": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:23:42 compute-0 nova_compute[186241]: 2025-11-25 06:23:42.978 186245 DEBUG oslo_concurrency.lockutils [req-123e0fd9-df00-41b8-b7e5-89e8c4351aaf req-3a6d9744-46f7-4b8e-b3f5-4268563d005d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-b79ea1d7-d6e1-430b-82bf-566447f159f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:23:44 compute-0 nova_compute[186241]: 2025-11-25 06:23:44.738 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:46 compute-0 nova_compute[186241]: 2025-11-25 06:23:46.948 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:47 compute-0 podman[213999]: 2025-11-25 06:23:47.05824491 +0000 UTC m=+0.037688764 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:23:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:47.417 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:47.417 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:23:47.418 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:49 compute-0 nova_compute[186241]: 2025-11-25 06:23:49.556 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:49 compute-0 nova_compute[186241]: 2025-11-25 06:23:49.557 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:49 compute-0 nova_compute[186241]: 2025-11-25 06:23:49.739 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:50 compute-0 nova_compute[186241]: 2025-11-25 06:23:50.060 186245 DEBUG nova.compute.manager [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:23:50 compute-0 nova_compute[186241]: 2025-11-25 06:23:50.585 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:50 compute-0 nova_compute[186241]: 2025-11-25 06:23:50.586 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:50 compute-0 nova_compute[186241]: 2025-11-25 06:23:50.591 186245 DEBUG nova.virt.hardware [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:23:50 compute-0 nova_compute[186241]: 2025-11-25 06:23:50.592 186245 INFO nova.compute.claims [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:23:51 compute-0 nova_compute[186241]: 2025-11-25 06:23:51.640 186245 DEBUG nova.compute.provider_tree [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:23:51 compute-0 nova_compute[186241]: 2025-11-25 06:23:51.950 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:52 compute-0 nova_compute[186241]: 2025-11-25 06:23:52.144 186245 DEBUG nova.scheduler.client.report [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:23:52 compute-0 nova_compute[186241]: 2025-11-25 06:23:52.649 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:52 compute-0 nova_compute[186241]: 2025-11-25 06:23:52.650 186245 DEBUG nova.compute.manager [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:23:53 compute-0 nova_compute[186241]: 2025-11-25 06:23:53.156 186245 DEBUG nova.compute.manager [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:23:53 compute-0 nova_compute[186241]: 2025-11-25 06:23:53.156 186245 DEBUG nova.network.neutron [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:23:53 compute-0 nova_compute[186241]: 2025-11-25 06:23:53.453 186245 DEBUG nova.policy [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:23:53 compute-0 nova_compute[186241]: 2025-11-25 06:23:53.659 186245 INFO nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:23:54 compute-0 nova_compute[186241]: 2025-11-25 06:23:54.163 186245 DEBUG nova.compute.manager [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:23:54 compute-0 nova_compute[186241]: 2025-11-25 06:23:54.743 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:55 compute-0 podman[214021]: 2025-11-25 06:23:55.082969108 +0000 UTC m=+0.059542631 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.147 186245 DEBUG nova.network.neutron [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Successfully created port: ca979869-5ec3-4219-ad54-05f1d95b74ba _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.172 186245 DEBUG nova.compute.manager [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.172 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.173 186245 INFO nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Creating image(s)
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.173 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.173 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.174 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.174 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.177 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.178 186245 DEBUG oslo_concurrency.processutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.222 186245 DEBUG oslo_concurrency.processutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.223 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.223 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.223 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.227 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.227 186245 DEBUG oslo_concurrency.processutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.270 186245 DEBUG oslo_concurrency.processutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.271 186245 DEBUG oslo_concurrency.processutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.296 186245 DEBUG oslo_concurrency.processutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk 1073741824" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.298 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.298 186245 DEBUG oslo_concurrency.processutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.343 186245 DEBUG oslo_concurrency.processutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.344 186245 DEBUG nova.virt.disk.api [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.344 186245 DEBUG oslo_concurrency.processutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.389 186245 DEBUG oslo_concurrency.processutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.390 186245 DEBUG nova.virt.disk.api [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.391 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.391 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Ensure instance console log exists: /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.391 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.392 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:23:55 compute-0 nova_compute[186241]: 2025-11-25 06:23:55.392 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:23:56 compute-0 nova_compute[186241]: 2025-11-25 06:23:56.247 186245 DEBUG nova.network.neutron [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Successfully updated port: ca979869-5ec3-4219-ad54-05f1d95b74ba _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:23:56 compute-0 nova_compute[186241]: 2025-11-25 06:23:56.378 186245 DEBUG nova.compute.manager [req-43a9adae-b196-4bc8-b412-3ed9243628a2 req-de7c97f7-d6a0-4dac-8162-49e52bc84506 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Received event network-changed-ca979869-5ec3-4219-ad54-05f1d95b74ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:23:56 compute-0 nova_compute[186241]: 2025-11-25 06:23:56.379 186245 DEBUG nova.compute.manager [req-43a9adae-b196-4bc8-b412-3ed9243628a2 req-de7c97f7-d6a0-4dac-8162-49e52bc84506 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Refreshing instance network info cache due to event network-changed-ca979869-5ec3-4219-ad54-05f1d95b74ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:23:56 compute-0 nova_compute[186241]: 2025-11-25 06:23:56.379 186245 DEBUG oslo_concurrency.lockutils [req-43a9adae-b196-4bc8-b412-3ed9243628a2 req-de7c97f7-d6a0-4dac-8162-49e52bc84506 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-b546e0a0-551c-4e33-a7a4-092c7b149ce6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:23:56 compute-0 nova_compute[186241]: 2025-11-25 06:23:56.379 186245 DEBUG oslo_concurrency.lockutils [req-43a9adae-b196-4bc8-b412-3ed9243628a2 req-de7c97f7-d6a0-4dac-8162-49e52bc84506 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-b546e0a0-551c-4e33-a7a4-092c7b149ce6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:23:56 compute-0 nova_compute[186241]: 2025-11-25 06:23:56.379 186245 DEBUG nova.network.neutron [req-43a9adae-b196-4bc8-b412-3ed9243628a2 req-de7c97f7-d6a0-4dac-8162-49e52bc84506 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Refreshing network info cache for port ca979869-5ec3-4219-ad54-05f1d95b74ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:23:56 compute-0 nova_compute[186241]: 2025-11-25 06:23:56.750 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-b546e0a0-551c-4e33-a7a4-092c7b149ce6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:23:56 compute-0 nova_compute[186241]: 2025-11-25 06:23:56.951 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:23:57 compute-0 nova_compute[186241]: 2025-11-25 06:23:57.211 186245 DEBUG nova.network.neutron [req-43a9adae-b196-4bc8-b412-3ed9243628a2 req-de7c97f7-d6a0-4dac-8162-49e52bc84506 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:23:57 compute-0 nova_compute[186241]: 2025-11-25 06:23:57.753 186245 DEBUG nova.network.neutron [req-43a9adae-b196-4bc8-b412-3ed9243628a2 req-de7c97f7-d6a0-4dac-8162-49e52bc84506 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:23:58 compute-0 nova_compute[186241]: 2025-11-25 06:23:58.257 186245 DEBUG oslo_concurrency.lockutils [req-43a9adae-b196-4bc8-b412-3ed9243628a2 req-de7c97f7-d6a0-4dac-8162-49e52bc84506 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-b546e0a0-551c-4e33-a7a4-092c7b149ce6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:23:58 compute-0 nova_compute[186241]: 2025-11-25 06:23:58.257 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-b546e0a0-551c-4e33-a7a4-092c7b149ce6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:23:58 compute-0 nova_compute[186241]: 2025-11-25 06:23:58.258 186245 DEBUG nova.network.neutron [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:23:59 compute-0 nova_compute[186241]: 2025-11-25 06:23:59.037 186245 DEBUG nova.network.neutron [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:23:59 compute-0 podman[214060]: 2025-11-25 06:23:59.0768093 +0000 UTC m=+0.048663253 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:23:59 compute-0 podman[214061]: 2025-11-25 06:23:59.097016354 +0000 UTC m=+0.066372533 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:23:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:23:59.549 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:23:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:23:59.552 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/b79ea1d7-d6e1-430b-82bf-566447f159f3 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}e471cc3fc7ae9ac5d8fd794e8aefa20e5f5c77c3e3edccb41964d2d46a7818d3" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Nov 25 06:23:59 compute-0 nova_compute[186241]: 2025-11-25 06:23:59.746 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.229 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1853 Content-Type: application/json Date: Tue, 25 Nov 2025 06:23:59 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-26caae96-1d0d-4bad-ad0c-140399db9599 x-openstack-request-id: req-26caae96-1d0d-4bad-ad0c-140399db9599 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.230 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "b79ea1d7-d6e1-430b-82bf-566447f159f3", "name": "tempest-TestNetworkBasicOps-server-112305377", "status": "ACTIVE", "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "user_id": "66a05d0ca82146a5a458244c8e5364de", "metadata": {}, "hostId": "d6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5", "image": {"id": "5215c26e-be2f-40b4-ac47-476bfa3cf3f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/5215c26e-be2f-40b4-ac47-476bfa3cf3f2"}]}, "flavor": {"id": "53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac"}]}, "created": "2025-11-25T06:23:00Z", "updated": "2025-11-25T06:23:18Z", "addresses": {"tempest-network-smoke--1881386856": [{"version": 4, "addr": "10.100.0.3", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:44:4d:aa"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/b79ea1d7-d6e1-430b-82bf-566447f159f3"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/b79ea1d7-d6e1-430b-82bf-566447f159f3"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-2120882882", "OS-SRV-USG:launched_at": "2025-11-25T06:23:18.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-secgroup-smoke-1984361748"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000004", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} 
_http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.230 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/b79ea1d7-d6e1-430b-82bf-566447f159f3 used request id req-26caae96-1d0d-4bad-ad0c-140399db9599 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.230 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b79ea1d7-d6e1-430b-82bf-566447f159f3', 'name': 'tempest-TestNetworkBasicOps-server-112305377', 'flavor': {'id': '53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000004', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd90b557db9104ecfb816b1cdab8712bd', 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'hostId': 'd6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.231 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.231 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2970>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.231 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2970>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.231 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.231 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2025-11-25T06:24:01.231491) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.239 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.239 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.239 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.240 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.240 16 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.240 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4310>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.240 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4310>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.240 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: memory.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.240 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2025-11-25T06:24:01.240302) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.250 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/memory.usage volume: 42.515625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.250 16 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.250 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.250 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.250 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4700>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.250 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4700>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.250 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.251 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets (2025-11-25T06:24:01.250958) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.252 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b79ea1d7-d6e1-430b-82bf-566447f159f3 / tap17e014dc-1f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.252 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/network.outgoing.packets volume: 109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.252 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.252 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.252 16 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.252 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2a60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.252 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2a60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.253 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.253 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.253 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.253 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.253 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2025-11-25T06:24:01.253039) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.253 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4910>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.253 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4910>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.253 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.253 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.254 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.254 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.254 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.254 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2025-11-25T06:24:01.253798) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.254 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2f40>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.254 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2f40>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.254 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.254 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2025-11-25T06:24:01.254652) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.270 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.write.requests volume: 331 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.271 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.271 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.271 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.271 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.271 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b25e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.271 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b25e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.271 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.271 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.read.latency volume: 269084507 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.272 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.read.latency volume: 24073774 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.272 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.272 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.272 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.272 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2025-11-25T06:24:01.271749) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.272 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4c70>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.272 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4c70>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.272 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.272 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/network.incoming.packets volume: 102 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.273 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2025-11-25T06:24:01.272845) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.273 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.273 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.273 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.273 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4d30>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.273 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4d30>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.273 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.273 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.274 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.274 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.274 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.274 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2025-11-25T06:24:01.273680) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.274 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2310>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.274 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2310>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.274 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.274 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.write.latency volume: 367888560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.274 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2025-11-25T06:24:01.274551) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.274 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.275 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.275 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.275 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.275 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b23a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.275 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b23a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.275 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.275 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.read.requests volume: 1081 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.275 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2025-11-25T06:24:01.275594) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.275 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.read.requests volume: 113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.276 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.276 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.276 16 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.276 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2b20>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.276 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2b20>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.276 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.root.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.276 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.277 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.277 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.277 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2025-11-25T06:24:01.276630) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.277 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4af0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.277 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4af0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.277 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.277 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.277 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.277 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.277 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.277 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2025-11-25T06:24:01.277347) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.278 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4760>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.278 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4760>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.278 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.278 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.278 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2025-11-25T06:24:01.278193) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.278 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-112305377>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-112305377>]
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.278 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.278 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.278 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2520>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.278 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2520>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.279 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.279 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.read.bytes volume: 30050816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.279 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.read.bytes volume: 284990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.279 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.279 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.279 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2025-11-25T06:24:01.278981) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.279 16 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.279 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800afdc0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.280 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800afdc0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.280 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: cpu heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.280 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/cpu volume: 9680000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.280 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2025-11-25T06:24:01.280094) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.280 16 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.280 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.280 16 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.280 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800ca460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.280 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800ca460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.280 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: power.state heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.281 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.281 16 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.281 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2025-11-25T06:24:01.280896) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.281 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.281 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.281 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c44f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.281 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c44f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.281 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.281 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/network.incoming.bytes volume: 19674 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.282 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.282 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.282 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.282 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2025-11-25T06:24:01.281705) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.282 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.282 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.282 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.282 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.282 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.283 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.283 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.283 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2025-11-25T06:24:01.282559) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.283 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.283 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b28e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.283 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b28e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.283 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.283 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.283 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2025-11-25T06:24:01.283611) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.283 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.284 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.284 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.284 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.284 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2d00>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.284 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2d00>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.284 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.284 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.write.bytes volume: 73011200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.284 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2025-11-25T06:24:01.284650) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.285 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.285 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.285 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.285 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.285 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4100>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.285 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4100>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.285 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.285 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.286 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.286 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.286 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2025-11-25T06:24:01.285722) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.286 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.286 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4bb0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.286 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4bb0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.286 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.286 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.286 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2025-11-25T06:24:01.286565) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.286 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-112305377>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-112305377>]
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.286 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.287 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.287 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4490>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.287 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4490>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.287 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.287 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.287 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2025-11-25T06:24:01.287243) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.287 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.287 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.287 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.287 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c42b0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.287 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c42b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.288 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.288 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.288 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2025-11-25T06:24:01.288046) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.288 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.288 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.288 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.288 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4070>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.288 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4070>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.288 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.288 16 DEBUG ceilometer.compute.pollsters [-] b79ea1d7-d6e1-430b-82bf-566447f159f3/network.outgoing.bytes volume: 16018 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.289 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Nov 25 06:24:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:24:01.289 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2025-11-25T06:24:01.288850) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:24:01 compute-0 nova_compute[186241]: 2025-11-25 06:24:01.952 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:02 compute-0 podman[214099]: 2025-11-25 06:24:02.064057628 +0000 UTC m=+0.035918423 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.214 186245 DEBUG nova.network.neutron [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Updating instance_info_cache with network_info: [{"id": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "address": "fa:16:3e:66:cb:82", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca979869-5e", "ovs_interfaceid": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.716 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-b546e0a0-551c-4e33-a7a4-092c7b149ce6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.717 186245 DEBUG nova.compute.manager [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Instance network_info: |[{"id": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "address": "fa:16:3e:66:cb:82", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca979869-5e", "ovs_interfaceid": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.718 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Start _get_guest_xml network_info=[{"id": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "address": "fa:16:3e:66:cb:82", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca979869-5e", "ovs_interfaceid": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.722 186245 WARNING nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.722 186245 DEBUG nova.virt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-690829021', uuid='b546e0a0-551c-4e33-a7a4-092c7b149ce6'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_ma
pping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "address": "fa:16:3e:66:cb:82", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca979869-5e", "ovs_interfaceid": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764051843.7227814) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.728 186245 DEBUG nova.virt.libvirt.host [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.728 186245 DEBUG nova.virt.libvirt.host [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.731 186245 DEBUG nova.virt.libvirt.host [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.732 186245 DEBUG nova.virt.libvirt.host [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.732 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.732 186245 DEBUG nova.virt.hardware [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.733 186245 DEBUG nova.virt.hardware [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.733 186245 DEBUG nova.virt.hardware [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.733 186245 DEBUG nova.virt.hardware [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.733 186245 DEBUG nova.virt.hardware [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.733 186245 DEBUG nova.virt.hardware [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.734 186245 DEBUG nova.virt.hardware [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.734 186245 DEBUG nova.virt.hardware [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.734 186245 DEBUG nova.virt.hardware [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.734 186245 DEBUG nova.virt.hardware [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.734 186245 DEBUG nova.virt.hardware [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.737 186245 DEBUG nova.virt.libvirt.vif [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:23:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-690829021',display_name='tempest-TestNetworkBasicOps-server-690829021',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-690829021',id=5,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNA2WsBLwfQAsG9CmIOk0KoKBTuxsZtfdEV+EgmrTAW6GfhnPk6z2gQ6xBYqp8bFm1jfZTNH6XW8DzM6xOScAqgb9VZIE6Vh6b0RNqPsMoMkKTtmI5h+jVTZpqcu7L4+7w==',key_name='tempest-TestNetworkBasicOps-1645687792',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-v507jr95',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:23:54Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=b546e0a0-551c-4e33-a7a4-092c7b149ce6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "address": "fa:16:3e:66:cb:82", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca979869-5e", "ovs_interfaceid": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.737 186245 DEBUG nova.network.os_vif_util [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "address": "fa:16:3e:66:cb:82", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca979869-5e", "ovs_interfaceid": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.738 186245 DEBUG nova.network.os_vif_util [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:cb:82,bridge_name='br-int',has_traffic_filtering=True,id=ca979869-5ec3-4219-ad54-05f1d95b74ba,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca979869-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:24:03 compute-0 nova_compute[186241]: 2025-11-25 06:24:03.738 186245 DEBUG nova.objects.instance [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid b546e0a0-551c-4e33-a7a4-092c7b149ce6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.243 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:24:04 compute-0 nova_compute[186241]:   <uuid>b546e0a0-551c-4e33-a7a4-092c7b149ce6</uuid>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   <name>instance-00000005</name>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-690829021</nova:name>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:24:03</nova:creationTime>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:24:04 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:24:04 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:24:04 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:24:04 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:24:04 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:24:04 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:24:04 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:24:04 compute-0 nova_compute[186241]:         <nova:port uuid="ca979869-5ec3-4219-ad54-05f1d95b74ba">
Nov 25 06:24:04 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <system>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <entry name="serial">b546e0a0-551c-4e33-a7a4-092c7b149ce6</entry>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <entry name="uuid">b546e0a0-551c-4e33-a7a4-092c7b149ce6</entry>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     </system>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   <os>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   </os>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   <features>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   </features>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk.config"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:66:cb:82"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <target dev="tapca979869-5e"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/console.log" append="off"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <video>
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     </video>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:24:04 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:24:04 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:24:04 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:24:04 compute-0 nova_compute[186241]: </domain>
Nov 25 06:24:04 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.244 186245 DEBUG nova.compute.manager [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Preparing to wait for external event network-vif-plugged-ca979869-5ec3-4219-ad54-05f1d95b74ba prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.244 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.244 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.244 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.245 186245 DEBUG nova.virt.libvirt.vif [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:23:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-690829021',display_name='tempest-TestNetworkBasicOps-server-690829021',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-690829021',id=5,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNA2WsBLwfQAsG9CmIOk0KoKBTuxsZtfdEV+EgmrTAW6GfhnPk6z2gQ6xBYqp8bFm1jfZTNH6XW8DzM6xOScAqgb9VZIE6Vh6b0RNqPsMoMkKTtmI5h+jVTZpqcu7L4+7w==',key_name='tempest-TestNetworkBasicOps-1645687792',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-v507jr95',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:23:54Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=b546e0a0-551c-4e33-a7a4-092c7b149ce6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "address": "fa:16:3e:66:cb:82", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca979869-5e", "ovs_interfaceid": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.245 186245 DEBUG nova.network.os_vif_util [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "address": "fa:16:3e:66:cb:82", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca979869-5e", "ovs_interfaceid": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.246 186245 DEBUG nova.network.os_vif_util [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:cb:82,bridge_name='br-int',has_traffic_filtering=True,id=ca979869-5ec3-4219-ad54-05f1d95b74ba,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca979869-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.246 186245 DEBUG os_vif [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:cb:82,bridge_name='br-int',has_traffic_filtering=True,id=ca979869-5ec3-4219-ad54-05f1d95b74ba,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca979869-5e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.246 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.247 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.247 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.248 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.248 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'cccae60e-4e59-5697-8de8-72b87afbf6e4', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.249 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.250 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.252 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.253 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca979869-5e, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.253 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapca979869-5e, col_values=(('qos', UUID('6b3e19ab-d3d6-4464-928d-67e4dc0841b4')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.253 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapca979869-5e, col_values=(('external_ids', {'iface-id': 'ca979869-5ec3-4219-ad54-05f1d95b74ba', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:cb:82', 'vm-uuid': 'b546e0a0-551c-4e33-a7a4-092c7b149ce6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.254 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:04 compute-0 NetworkManager[55345]: <info>  [1764051844.2552] manager: (tapca979869-5e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.256 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.260 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:04 compute-0 nova_compute[186241]: 2025-11-25 06:24:04.261 186245 INFO os_vif [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:cb:82,bridge_name='br-int',has_traffic_filtering=True,id=ca979869-5ec3-4219-ad54-05f1d95b74ba,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca979869-5e')
Nov 25 06:24:05 compute-0 nova_compute[186241]: 2025-11-25 06:24:05.781 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:24:05 compute-0 nova_compute[186241]: 2025-11-25 06:24:05.782 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:24:05 compute-0 nova_compute[186241]: 2025-11-25 06:24:05.783 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:66:cb:82, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:24:05 compute-0 nova_compute[186241]: 2025-11-25 06:24:05.783 186245 INFO nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Using config drive
Nov 25 06:24:06 compute-0 nova_compute[186241]: 2025-11-25 06:24:06.791 186245 INFO nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Creating config drive at /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk.config
Nov 25 06:24:06 compute-0 nova_compute[186241]: 2025-11-25 06:24:06.796 186245 DEBUG oslo_concurrency.processutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpmkjqupqb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:24:06 compute-0 nova_compute[186241]: 2025-11-25 06:24:06.914 186245 DEBUG oslo_concurrency.processutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpmkjqupqb" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:24:06 compute-0 nova_compute[186241]: 2025-11-25 06:24:06.954 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:06 compute-0 kernel: tapca979869-5e: entered promiscuous mode
Nov 25 06:24:06 compute-0 NetworkManager[55345]: <info>  [1764051846.9596] manager: (tapca979869-5e): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Nov 25 06:24:06 compute-0 ovn_controller[95135]: 2025-11-25T06:24:06Z|00086|binding|INFO|Claiming lport ca979869-5ec3-4219-ad54-05f1d95b74ba for this chassis.
Nov 25 06:24:06 compute-0 ovn_controller[95135]: 2025-11-25T06:24:06Z|00087|binding|INFO|ca979869-5ec3-4219-ad54-05f1d95b74ba: Claiming fa:16:3e:66:cb:82 10.100.0.11
Nov 25 06:24:06 compute-0 nova_compute[186241]: 2025-11-25 06:24:06.961 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:06 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:06.970 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:cb:82 10.100.0.11'], port_security=['fa:16:3e:66:cb:82 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b546e0a0-551c-4e33-a7a4-092c7b149ce6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f088e83e-6869-485f-aff5-47d816c267b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5aa8e2df-a8af-44d6-8e00-dab01a7a4c94', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5f4b8f2-91df-4e6d-a0cc-17ea8a984247, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=ca979869-5ec3-4219-ad54-05f1d95b74ba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:24:06 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:06.971 103953 INFO neutron.agent.ovn.metadata.agent [-] Port ca979869-5ec3-4219-ad54-05f1d95b74ba in datapath f088e83e-6869-485f-aff5-47d816c267b4 bound to our chassis
Nov 25 06:24:06 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:06.972 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f088e83e-6869-485f-aff5-47d816c267b4
Nov 25 06:24:06 compute-0 ovn_controller[95135]: 2025-11-25T06:24:06Z|00088|binding|INFO|Setting lport ca979869-5ec3-4219-ad54-05f1d95b74ba ovn-installed in OVS
Nov 25 06:24:06 compute-0 ovn_controller[95135]: 2025-11-25T06:24:06Z|00089|binding|INFO|Setting lport ca979869-5ec3-4219-ad54-05f1d95b74ba up in Southbound
Nov 25 06:24:06 compute-0 nova_compute[186241]: 2025-11-25 06:24:06.974 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:06 compute-0 nova_compute[186241]: 2025-11-25 06:24:06.977 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:06 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:06.985 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[0c7fa0d5-f1a4-48bc-9384-a405d10b3d4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:06 compute-0 systemd-udevd[214143]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:24:07 compute-0 NetworkManager[55345]: <info>  [1764051847.0015] device (tapca979869-5e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:24:07 compute-0 NetworkManager[55345]: <info>  [1764051847.0019] device (tapca979869-5e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:24:07 compute-0 systemd-machined[152921]: New machine qemu-5-instance-00000005.
Nov 25 06:24:07 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Nov 25 06:24:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:07.009 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[9175ad3d-ff06-4664-ab51-468cfbba38b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:07.012 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[4a59eb69-536c-4708-8ec8-b796afe79e05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:07.033 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[b724e137-3ae2-4bb9-bb5e-99a4623a41a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:07.047 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ca542805-7810-4178-aac4-d963f3825bb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf088e83e-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:19:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 278613, 'reachable_time': 37864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214162, 'error': None, 'target': 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:07.060 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[33e13c1e-496b-45d0-933f-8f2aa15d7852]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf088e83e-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 278620, 'tstamp': 278620}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214167, 'error': None, 'target': 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf088e83e-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 278622, 'tstamp': 278622}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214167, 'error': None, 'target': 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:07.061 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf088e83e-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:07 compute-0 nova_compute[186241]: 2025-11-25 06:24:07.062 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:07 compute-0 nova_compute[186241]: 2025-11-25 06:24:07.063 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:07.064 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf088e83e-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:07.065 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:24:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:07.065 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf088e83e-60, col_values=(('external_ids', {'iface-id': 'e4302f31-94aa-43d3-9e59-7579149e9537'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:07.065 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:24:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:07.066 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[08081eb4-826b-4c57-8168-916000f25b7b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f088e83e-6869-485f-aff5-47d816c267b4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f088e83e-6869-485f-aff5-47d816c267b4\n') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:07 compute-0 podman[214127]: 2025-11-25 06:24:07.072290254 +0000 UTC m=+0.116638914 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=edpm, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6)
Nov 25 06:24:07 compute-0 nova_compute[186241]: 2025-11-25 06:24:07.398 186245 DEBUG nova.compute.manager [req-87a0bc3a-1df5-4baf-901c-a52d1d72574a req-c2e479f9-8f51-4b60-9fe9-2cd3e245a2e7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Received event network-vif-plugged-ca979869-5ec3-4219-ad54-05f1d95b74ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:24:07 compute-0 nova_compute[186241]: 2025-11-25 06:24:07.399 186245 DEBUG oslo_concurrency.lockutils [req-87a0bc3a-1df5-4baf-901c-a52d1d72574a req-c2e479f9-8f51-4b60-9fe9-2cd3e245a2e7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:07 compute-0 nova_compute[186241]: 2025-11-25 06:24:07.399 186245 DEBUG oslo_concurrency.lockutils [req-87a0bc3a-1df5-4baf-901c-a52d1d72574a req-c2e479f9-8f51-4b60-9fe9-2cd3e245a2e7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:07 compute-0 nova_compute[186241]: 2025-11-25 06:24:07.400 186245 DEBUG oslo_concurrency.lockutils [req-87a0bc3a-1df5-4baf-901c-a52d1d72574a req-c2e479f9-8f51-4b60-9fe9-2cd3e245a2e7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:07 compute-0 nova_compute[186241]: 2025-11-25 06:24:07.400 186245 DEBUG nova.compute.manager [req-87a0bc3a-1df5-4baf-901c-a52d1d72574a req-c2e479f9-8f51-4b60-9fe9-2cd3e245a2e7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Processing event network-vif-plugged-ca979869-5ec3-4219-ad54-05f1d95b74ba _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:24:07 compute-0 nova_compute[186241]: 2025-11-25 06:24:07.802 186245 DEBUG nova.compute.manager [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:24:07 compute-0 nova_compute[186241]: 2025-11-25 06:24:07.805 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:24:07 compute-0 nova_compute[186241]: 2025-11-25 06:24:07.808 186245 INFO nova.virt.libvirt.driver [-] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Instance spawned successfully.
Nov 25 06:24:07 compute-0 nova_compute[186241]: 2025-11-25 06:24:07.808 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:24:08 compute-0 nova_compute[186241]: 2025-11-25 06:24:08.316 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:24:08 compute-0 nova_compute[186241]: 2025-11-25 06:24:08.317 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:24:08 compute-0 nova_compute[186241]: 2025-11-25 06:24:08.317 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:24:08 compute-0 nova_compute[186241]: 2025-11-25 06:24:08.317 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:24:08 compute-0 nova_compute[186241]: 2025-11-25 06:24:08.318 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:24:08 compute-0 nova_compute[186241]: 2025-11-25 06:24:08.318 186245 DEBUG nova.virt.libvirt.driver [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:24:08 compute-0 nova_compute[186241]: 2025-11-25 06:24:08.824 186245 INFO nova.compute.manager [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Took 13.65 seconds to spawn the instance on the hypervisor.
Nov 25 06:24:08 compute-0 nova_compute[186241]: 2025-11-25 06:24:08.825 186245 DEBUG nova.compute.manager [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:24:09 compute-0 nova_compute[186241]: 2025-11-25 06:24:09.256 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:09 compute-0 nova_compute[186241]: 2025-11-25 06:24:09.337 186245 INFO nova.compute.manager [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Took 18.77 seconds to build instance.
Nov 25 06:24:09 compute-0 nova_compute[186241]: 2025-11-25 06:24:09.546 186245 DEBUG nova.compute.manager [req-1cf0b6da-a44c-48e1-ab3c-979072698769 req-0d781b53-bf54-4b28-a192-0224e1340d3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Received event network-vif-plugged-ca979869-5ec3-4219-ad54-05f1d95b74ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:24:09 compute-0 nova_compute[186241]: 2025-11-25 06:24:09.547 186245 DEBUG oslo_concurrency.lockutils [req-1cf0b6da-a44c-48e1-ab3c-979072698769 req-0d781b53-bf54-4b28-a192-0224e1340d3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:09 compute-0 nova_compute[186241]: 2025-11-25 06:24:09.547 186245 DEBUG oslo_concurrency.lockutils [req-1cf0b6da-a44c-48e1-ab3c-979072698769 req-0d781b53-bf54-4b28-a192-0224e1340d3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:09 compute-0 nova_compute[186241]: 2025-11-25 06:24:09.547 186245 DEBUG oslo_concurrency.lockutils [req-1cf0b6da-a44c-48e1-ab3c-979072698769 req-0d781b53-bf54-4b28-a192-0224e1340d3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:09 compute-0 nova_compute[186241]: 2025-11-25 06:24:09.547 186245 DEBUG nova.compute.manager [req-1cf0b6da-a44c-48e1-ab3c-979072698769 req-0d781b53-bf54-4b28-a192-0224e1340d3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] No waiting events found dispatching network-vif-plugged-ca979869-5ec3-4219-ad54-05f1d95b74ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:24:09 compute-0 nova_compute[186241]: 2025-11-25 06:24:09.548 186245 WARNING nova.compute.manager [req-1cf0b6da-a44c-48e1-ab3c-979072698769 req-0d781b53-bf54-4b28-a192-0224e1340d3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Received unexpected event network-vif-plugged-ca979869-5ec3-4219-ad54-05f1d95b74ba for instance with vm_state active and task_state None.
Nov 25 06:24:09 compute-0 nova_compute[186241]: 2025-11-25 06:24:09.839 186245 DEBUG oslo_concurrency.lockutils [None req-ff83afbb-9c6e-4b3a-9f7e-bef792100e42 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:11 compute-0 nova_compute[186241]: 2025-11-25 06:24:11.956 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:12 compute-0 nova_compute[186241]: 2025-11-25 06:24:12.690 186245 DEBUG nova.compute.manager [req-0aa09278-4f6c-4083-8256-c318b0e4fb46 req-5ba36a6b-3ac8-48fe-8fc9-199390a5e664 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Received event network-changed-ca979869-5ec3-4219-ad54-05f1d95b74ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:24:12 compute-0 nova_compute[186241]: 2025-11-25 06:24:12.691 186245 DEBUG nova.compute.manager [req-0aa09278-4f6c-4083-8256-c318b0e4fb46 req-5ba36a6b-3ac8-48fe-8fc9-199390a5e664 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Refreshing instance network info cache due to event network-changed-ca979869-5ec3-4219-ad54-05f1d95b74ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:24:12 compute-0 nova_compute[186241]: 2025-11-25 06:24:12.691 186245 DEBUG oslo_concurrency.lockutils [req-0aa09278-4f6c-4083-8256-c318b0e4fb46 req-5ba36a6b-3ac8-48fe-8fc9-199390a5e664 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-b546e0a0-551c-4e33-a7a4-092c7b149ce6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:24:12 compute-0 nova_compute[186241]: 2025-11-25 06:24:12.692 186245 DEBUG oslo_concurrency.lockutils [req-0aa09278-4f6c-4083-8256-c318b0e4fb46 req-5ba36a6b-3ac8-48fe-8fc9-199390a5e664 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-b546e0a0-551c-4e33-a7a4-092c7b149ce6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:24:12 compute-0 nova_compute[186241]: 2025-11-25 06:24:12.692 186245 DEBUG nova.network.neutron [req-0aa09278-4f6c-4083-8256-c318b0e4fb46 req-5ba36a6b-3ac8-48fe-8fc9-199390a5e664 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Refreshing network info cache for port ca979869-5ec3-4219-ad54-05f1d95b74ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:24:13 compute-0 podman[214176]: 2025-11-25 06:24:13.069935082 +0000 UTC m=+0.044310205 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 25 06:24:14 compute-0 nova_compute[186241]: 2025-11-25 06:24:14.258 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:16 compute-0 nova_compute[186241]: 2025-11-25 06:24:16.215 186245 DEBUG nova.network.neutron [req-0aa09278-4f6c-4083-8256-c318b0e4fb46 req-5ba36a6b-3ac8-48fe-8fc9-199390a5e664 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Updated VIF entry in instance network info cache for port ca979869-5ec3-4219-ad54-05f1d95b74ba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:24:16 compute-0 nova_compute[186241]: 2025-11-25 06:24:16.216 186245 DEBUG nova.network.neutron [req-0aa09278-4f6c-4083-8256-c318b0e4fb46 req-5ba36a6b-3ac8-48fe-8fc9-199390a5e664 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Updating instance_info_cache with network_info: [{"id": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "address": "fa:16:3e:66:cb:82", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca979869-5e", "ovs_interfaceid": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:24:16 compute-0 nova_compute[186241]: 2025-11-25 06:24:16.719 186245 DEBUG oslo_concurrency.lockutils [req-0aa09278-4f6c-4083-8256-c318b0e4fb46 req-5ba36a6b-3ac8-48fe-8fc9-199390a5e664 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-b546e0a0-551c-4e33-a7a4-092c7b149ce6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:24:16 compute-0 nova_compute[186241]: 2025-11-25 06:24:16.957 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:18 compute-0 podman[214204]: 2025-11-25 06:24:18.056119418 +0000 UTC m=+0.035637623 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:24:18 compute-0 ovn_controller[95135]: 2025-11-25T06:24:18Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:66:cb:82 10.100.0.11
Nov 25 06:24:18 compute-0 ovn_controller[95135]: 2025-11-25T06:24:18Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:66:cb:82 10.100.0.11
Nov 25 06:24:18 compute-0 nova_compute[186241]: 2025-11-25 06:24:18.933 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:24:19 compute-0 nova_compute[186241]: 2025-11-25 06:24:19.262 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:19 compute-0 nova_compute[186241]: 2025-11-25 06:24:19.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:24:21 compute-0 nova_compute[186241]: 2025-11-25 06:24:21.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:24:21 compute-0 nova_compute[186241]: 2025-11-25 06:24:21.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:24:21 compute-0 nova_compute[186241]: 2025-11-25 06:24:21.959 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:22 compute-0 nova_compute[186241]: 2025-11-25 06:24:22.441 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:22 compute-0 nova_compute[186241]: 2025-11-25 06:24:22.442 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:22 compute-0 nova_compute[186241]: 2025-11-25 06:24:22.442 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:22 compute-0 nova_compute[186241]: 2025-11-25 06:24:22.442 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:24:23 compute-0 nova_compute[186241]: 2025-11-25 06:24:23.466 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:24:23 compute-0 nova_compute[186241]: 2025-11-25 06:24:23.512 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:24:23 compute-0 nova_compute[186241]: 2025-11-25 06:24:23.513 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:24:23 compute-0 nova_compute[186241]: 2025-11-25 06:24:23.556 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:24:23 compute-0 nova_compute[186241]: 2025-11-25 06:24:23.560 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:24:23 compute-0 nova_compute[186241]: 2025-11-25 06:24:23.604 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:24:23 compute-0 nova_compute[186241]: 2025-11-25 06:24:23.605 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:24:23 compute-0 nova_compute[186241]: 2025-11-25 06:24:23.660 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:24:23 compute-0 nova_compute[186241]: 2025-11-25 06:24:23.863 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:24:23 compute-0 nova_compute[186241]: 2025-11-25 06:24:23.864 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5467MB free_disk=72.96375274658203GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:24:23 compute-0 nova_compute[186241]: 2025-11-25 06:24:23.864 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:23 compute-0 nova_compute[186241]: 2025-11-25 06:24:23.865 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.265 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.900 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance b79ea1d7-d6e1-430b-82bf-566447f159f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.900 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance b546e0a0-551c-4e33-a7a4-092c7b149ce6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.900 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.901 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.914 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing inventories for resource provider b9b31722-b833-4ea1-a013-247935742e36 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:822
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.926 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating ProviderTree inventory for provider b9b31722-b833-4ea1-a013-247935742e36 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:786
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.926 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating inventory in ProviderTree for provider b9b31722-b833-4ea1-a013-247935742e36 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.938 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing aggregate associations for resource provider b9b31722-b833-4ea1-a013-247935742e36, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:831
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.953 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing trait associations for resource provider b9b31722-b833-4ea1-a013-247935742e36, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_AVX512VPCLMULQDQ,HW_CPU_X86_AESNI,HW_CPU_X86_F16C,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,HW_ARCH_X86_64,HW_CPU_X86_AMD_SVM,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX512VAES,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX2,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:843
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.988 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.995 186245 INFO nova.compute.manager [None req-f23db79a-a68e-4f83-9442-c00b264e2edc 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Get console output
Nov 25 06:24:24 compute-0 nova_compute[186241]: 2025-11-25 06:24:24.998 211770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 25 06:24:25 compute-0 nova_compute[186241]: 2025-11-25 06:24:25.491 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:24:25 compute-0 nova_compute[186241]: 2025-11-25 06:24:25.996 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:24:25 compute-0 nova_compute[186241]: 2025-11-25 06:24:25.997 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:26 compute-0 podman[214239]: 2025-11-25 06:24:26.074697518 +0000 UTC m=+0.054746810 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.285 186245 DEBUG oslo_concurrency.lockutils [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.286 186245 DEBUG oslo_concurrency.lockutils [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.286 186245 DEBUG oslo_concurrency.lockutils [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.286 186245 DEBUG oslo_concurrency.lockutils [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.286 186245 DEBUG oslo_concurrency.lockutils [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.287 186245 INFO nova.compute.manager [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Terminating instance
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.791 186245 DEBUG nova.compute.manager [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:24:26 compute-0 kernel: tapca979869-5e (unregistering): left promiscuous mode
Nov 25 06:24:26 compute-0 NetworkManager[55345]: <info>  [1764051866.8133] device (tapca979869-5e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:24:26 compute-0 ovn_controller[95135]: 2025-11-25T06:24:26Z|00090|binding|INFO|Releasing lport ca979869-5ec3-4219-ad54-05f1d95b74ba from this chassis (sb_readonly=0)
Nov 25 06:24:26 compute-0 ovn_controller[95135]: 2025-11-25T06:24:26Z|00091|binding|INFO|Setting lport ca979869-5ec3-4219-ad54-05f1d95b74ba down in Southbound
Nov 25 06:24:26 compute-0 ovn_controller[95135]: 2025-11-25T06:24:26Z|00092|binding|INFO|Removing iface tapca979869-5e ovn-installed in OVS
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.822 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.825 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:cb:82 10.100.0.11'], port_security=['fa:16:3e:66:cb:82 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b546e0a0-551c-4e33-a7a4-092c7b149ce6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f088e83e-6869-485f-aff5-47d816c267b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5aa8e2df-a8af-44d6-8e00-dab01a7a4c94', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.249'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5f4b8f2-91df-4e6d-a0cc-17ea8a984247, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=ca979869-5ec3-4219-ad54-05f1d95b74ba) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.830 103953 INFO neutron.agent.ovn.metadata.agent [-] Port ca979869-5ec3-4219-ad54-05f1d95b74ba in datapath f088e83e-6869-485f-aff5-47d816c267b4 unbound from our chassis
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.831 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f088e83e-6869-485f-aff5-47d816c267b4
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.836 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.845 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[3f38ab1d-0f98-4e11-8edc-5953a918f81f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:26 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Nov 25 06:24:26 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 11.958s CPU time.
Nov 25 06:24:26 compute-0 systemd-machined[152921]: Machine qemu-5-instance-00000005 terminated.
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.862 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[0d31b0a9-3d25-45a7-93e5-9f0c3b3c3a40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.864 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[70272f51-9820-4415-aad1-76201371c732]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.878 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[d44b4c65-2e78-4867-a699-77ecc413d15b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.891 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[07b9d713-5e20-4ab8-a8f8-6b8a32994489]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf088e83e-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:19:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 278613, 'reachable_time': 37864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214274, 'error': None, 'target': 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.901 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[1cbc4871-4702-46b2-824a-7df5977c4b91]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf088e83e-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 278620, 'tstamp': 278620}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214275, 'error': None, 'target': 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf088e83e-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 278622, 'tstamp': 278622}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214275, 'error': None, 'target': 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.902 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf088e83e-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.903 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.906 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.906 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf088e83e-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.906 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.907 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf088e83e-60, col_values=(('external_ids', {'iface-id': 'e4302f31-94aa-43d3-9e59-7579149e9537'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.907 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:24:26 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:26.908 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[9987aa1f-51e5-4024-817a-3f4743e19c9d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f088e83e-6869-485f-aff5-47d816c267b4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f088e83e-6869-485f-aff5-47d816c267b4\n') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.959 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.994 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.995 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.996 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.996 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:24:26 compute-0 nova_compute[186241]: 2025-11-25 06:24:26.997 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.028 186245 DEBUG nova.compute.manager [req-f87d19fa-c02c-4a36-a71c-88fea9e99995 req-8e77c7d8-e5d7-4c8f-bb85-bb02a305bf67 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Received event network-vif-unplugged-ca979869-5ec3-4219-ad54-05f1d95b74ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.028 186245 DEBUG oslo_concurrency.lockutils [req-f87d19fa-c02c-4a36-a71c-88fea9e99995 req-8e77c7d8-e5d7-4c8f-bb85-bb02a305bf67 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.029 186245 DEBUG oslo_concurrency.lockutils [req-f87d19fa-c02c-4a36-a71c-88fea9e99995 req-8e77c7d8-e5d7-4c8f-bb85-bb02a305bf67 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.029 186245 DEBUG oslo_concurrency.lockutils [req-f87d19fa-c02c-4a36-a71c-88fea9e99995 req-8e77c7d8-e5d7-4c8f-bb85-bb02a305bf67 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.029 186245 DEBUG nova.compute.manager [req-f87d19fa-c02c-4a36-a71c-88fea9e99995 req-8e77c7d8-e5d7-4c8f-bb85-bb02a305bf67 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] No waiting events found dispatching network-vif-unplugged-ca979869-5ec3-4219-ad54-05f1d95b74ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.029 186245 DEBUG nova.compute.manager [req-f87d19fa-c02c-4a36-a71c-88fea9e99995 req-8e77c7d8-e5d7-4c8f-bb85-bb02a305bf67 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Received event network-vif-unplugged-ca979869-5ec3-4219-ad54-05f1d95b74ba for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.032 186245 INFO nova.virt.libvirt.driver [-] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Instance destroyed successfully.
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.032 186245 DEBUG nova.objects.instance [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid b546e0a0-551c-4e33-a7a4-092c7b149ce6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.534 186245 DEBUG nova.virt.libvirt.vif [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:23:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-690829021',display_name='tempest-TestNetworkBasicOps-server-690829021',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-690829021',id=5,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNA2WsBLwfQAsG9CmIOk0KoKBTuxsZtfdEV+EgmrTAW6GfhnPk6z2gQ6xBYqp8bFm1jfZTNH6XW8DzM6xOScAqgb9VZIE6Vh6b0RNqPsMoMkKTtmI5h+jVTZpqcu7L4+7w==',key_name='tempest-TestNetworkBasicOps-1645687792',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:24:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-v507jr95',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:24:08Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=b546e0a0-551c-4e33-a7a4-092c7b149ce6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "address": "fa:16:3e:66:cb:82", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca979869-5e", "ovs_interfaceid": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.535 186245 DEBUG nova.network.os_vif_util [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "address": "fa:16:3e:66:cb:82", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca979869-5e", "ovs_interfaceid": "ca979869-5ec3-4219-ad54-05f1d95b74ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.536 186245 DEBUG nova.network.os_vif_util [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:66:cb:82,bridge_name='br-int',has_traffic_filtering=True,id=ca979869-5ec3-4219-ad54-05f1d95b74ba,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca979869-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.536 186245 DEBUG os_vif [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:cb:82,bridge_name='br-int',has_traffic_filtering=True,id=ca979869-5ec3-4219-ad54-05f1d95b74ba,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca979869-5e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.537 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.537 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca979869-5e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.538 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.541 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.542 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.542 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=6b3e19ab-d3d6-4464-928d-67e4dc0841b4) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.543 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.545 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.547 186245 INFO os_vif [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:cb:82,bridge_name='br-int',has_traffic_filtering=True,id=ca979869-5ec3-4219-ad54-05f1d95b74ba,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca979869-5e')
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.547 186245 INFO nova.virt.libvirt.driver [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Deleting instance files /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6_del
Nov 25 06:24:27 compute-0 nova_compute[186241]: 2025-11-25 06:24:27.548 186245 INFO nova.virt.libvirt.driver [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Deletion of /var/lib/nova/instances/b546e0a0-551c-4e33-a7a4-092c7b149ce6_del complete
Nov 25 06:24:28 compute-0 nova_compute[186241]: 2025-11-25 06:24:28.055 186245 INFO nova.compute.manager [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Took 1.26 seconds to destroy the instance on the hypervisor.
Nov 25 06:24:28 compute-0 nova_compute[186241]: 2025-11-25 06:24:28.055 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:24:28 compute-0 nova_compute[186241]: 2025-11-25 06:24:28.055 186245 DEBUG nova.compute.manager [-] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:24:28 compute-0 nova_compute[186241]: 2025-11-25 06:24:28.056 186245 DEBUG nova.network.neutron [-] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:24:29 compute-0 nova_compute[186241]: 2025-11-25 06:24:29.093 186245 DEBUG nova.compute.manager [req-3afe5843-3609-4511-9216-b60cb8b651a4 req-c3b00dcb-d0ae-487c-964f-230c20949c25 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Received event network-vif-deleted-ca979869-5ec3-4219-ad54-05f1d95b74ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:24:29 compute-0 nova_compute[186241]: 2025-11-25 06:24:29.094 186245 INFO nova.compute.manager [req-3afe5843-3609-4511-9216-b60cb8b651a4 req-c3b00dcb-d0ae-487c-964f-230c20949c25 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Neutron deleted interface ca979869-5ec3-4219-ad54-05f1d95b74ba; detaching it from the instance and deleting it from the info cache
Nov 25 06:24:29 compute-0 nova_compute[186241]: 2025-11-25 06:24:29.094 186245 DEBUG nova.network.neutron [req-3afe5843-3609-4511-9216-b60cb8b651a4 req-c3b00dcb-d0ae-487c-964f-230c20949c25 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:24:29 compute-0 nova_compute[186241]: 2025-11-25 06:24:29.241 186245 DEBUG nova.compute.manager [req-0d68d9b9-ffc6-4220-9b2b-d20e90924dba req-657be142-8915-4e2a-8680-764d3d8c9c93 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Received event network-vif-plugged-ca979869-5ec3-4219-ad54-05f1d95b74ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:24:29 compute-0 nova_compute[186241]: 2025-11-25 06:24:29.241 186245 DEBUG oslo_concurrency.lockutils [req-0d68d9b9-ffc6-4220-9b2b-d20e90924dba req-657be142-8915-4e2a-8680-764d3d8c9c93 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:29 compute-0 nova_compute[186241]: 2025-11-25 06:24:29.242 186245 DEBUG oslo_concurrency.lockutils [req-0d68d9b9-ffc6-4220-9b2b-d20e90924dba req-657be142-8915-4e2a-8680-764d3d8c9c93 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:29 compute-0 nova_compute[186241]: 2025-11-25 06:24:29.242 186245 DEBUG oslo_concurrency.lockutils [req-0d68d9b9-ffc6-4220-9b2b-d20e90924dba req-657be142-8915-4e2a-8680-764d3d8c9c93 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:29 compute-0 nova_compute[186241]: 2025-11-25 06:24:29.242 186245 DEBUG nova.compute.manager [req-0d68d9b9-ffc6-4220-9b2b-d20e90924dba req-657be142-8915-4e2a-8680-764d3d8c9c93 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] No waiting events found dispatching network-vif-plugged-ca979869-5ec3-4219-ad54-05f1d95b74ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:24:29 compute-0 nova_compute[186241]: 2025-11-25 06:24:29.242 186245 WARNING nova.compute.manager [req-0d68d9b9-ffc6-4220-9b2b-d20e90924dba req-657be142-8915-4e2a-8680-764d3d8c9c93 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Received unexpected event network-vif-plugged-ca979869-5ec3-4219-ad54-05f1d95b74ba for instance with vm_state active and task_state deleting.
Nov 25 06:24:29 compute-0 nova_compute[186241]: 2025-11-25 06:24:29.427 186245 DEBUG nova.network.neutron [-] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:24:29 compute-0 nova_compute[186241]: 2025-11-25 06:24:29.598 186245 DEBUG nova.compute.manager [req-3afe5843-3609-4511-9216-b60cb8b651a4 req-c3b00dcb-d0ae-487c-964f-230c20949c25 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Detach interface failed, port_id=ca979869-5ec3-4219-ad54-05f1d95b74ba, reason: Instance b546e0a0-551c-4e33-a7a4-092c7b149ce6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Nov 25 06:24:29 compute-0 nova_compute[186241]: 2025-11-25 06:24:29.931 186245 INFO nova.compute.manager [-] [instance: b546e0a0-551c-4e33-a7a4-092c7b149ce6] Took 1.88 seconds to deallocate network for instance.
Nov 25 06:24:30 compute-0 podman[214295]: 2025-11-25 06:24:30.060987261 +0000 UTC m=+0.038739675 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:24:30 compute-0 podman[214294]: 2025-11-25 06:24:30.064882056 +0000 UTC m=+0.043240070 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=multipathd)
Nov 25 06:24:30 compute-0 nova_compute[186241]: 2025-11-25 06:24:30.436 186245 DEBUG oslo_concurrency.lockutils [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:30 compute-0 nova_compute[186241]: 2025-11-25 06:24:30.437 186245 DEBUG oslo_concurrency.lockutils [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:30 compute-0 nova_compute[186241]: 2025-11-25 06:24:30.483 186245 DEBUG nova.compute.provider_tree [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:24:30 compute-0 nova_compute[186241]: 2025-11-25 06:24:30.987 186245 DEBUG nova.scheduler.client.report [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:24:31 compute-0 nova_compute[186241]: 2025-11-25 06:24:31.492 186245 DEBUG oslo_concurrency.lockutils [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:31 compute-0 nova_compute[186241]: 2025-11-25 06:24:31.509 186245 INFO nova.scheduler.client.report [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance b546e0a0-551c-4e33-a7a4-092c7b149ce6
Nov 25 06:24:31 compute-0 nova_compute[186241]: 2025-11-25 06:24:31.961 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:32 compute-0 nova_compute[186241]: 2025-11-25 06:24:32.517 186245 DEBUG oslo_concurrency.lockutils [None req-1ec43601-1e9b-421e-be8a-d8bd89c2085f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b546e0a0-551c-4e33-a7a4-092c7b149ce6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:32 compute-0 nova_compute[186241]: 2025-11-25 06:24:32.543 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:33 compute-0 podman[214333]: 2025-11-25 06:24:33.060015973 +0000 UTC m=+0.039838655 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.066 186245 DEBUG oslo_concurrency.lockutils [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "b79ea1d7-d6e1-430b-82bf-566447f159f3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.066 186245 DEBUG oslo_concurrency.lockutils [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.066 186245 DEBUG oslo_concurrency.lockutils [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.067 186245 DEBUG oslo_concurrency.lockutils [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.067 186245 DEBUG oslo_concurrency.lockutils [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.067 186245 INFO nova.compute.manager [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Terminating instance
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.571 186245 DEBUG nova.compute.manager [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:24:35 compute-0 kernel: tap17e014dc-1f (unregistering): left promiscuous mode
Nov 25 06:24:35 compute-0 NetworkManager[55345]: <info>  [1764051875.5939] device (tap17e014dc-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.597 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:35 compute-0 ovn_controller[95135]: 2025-11-25T06:24:35Z|00093|binding|INFO|Releasing lport 17e014dc-1fe1-4091-95b3-3c08eb9abbb2 from this chassis (sb_readonly=0)
Nov 25 06:24:35 compute-0 ovn_controller[95135]: 2025-11-25T06:24:35Z|00094|binding|INFO|Setting lport 17e014dc-1fe1-4091-95b3-3c08eb9abbb2 down in Southbound
Nov 25 06:24:35 compute-0 ovn_controller[95135]: 2025-11-25T06:24:35Z|00095|binding|INFO|Removing iface tap17e014dc-1f ovn-installed in OVS
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.599 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.603 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:4d:aa 10.100.0.3'], port_security=['fa:16:3e:44:4d:aa 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b79ea1d7-d6e1-430b-82bf-566447f159f3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f088e83e-6869-485f-aff5-47d816c267b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '90a196c4-6984-431f-afc3-bb9d2e72304f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5f4b8f2-91df-4e6d-a0cc-17ea8a984247, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=17e014dc-1fe1-4091-95b3-3c08eb9abbb2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.606 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 17e014dc-1fe1-4091-95b3-3c08eb9abbb2 in datapath f088e83e-6869-485f-aff5-47d816c267b4 unbound from our chassis
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.607 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f088e83e-6869-485f-aff5-47d816c267b4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.607 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[59fe87a6-3a59-472d-9db6-05c6bbbaa30b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.608 103953 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4 namespace which is not needed anymore
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.617 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:35 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Nov 25 06:24:35 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 11.705s CPU time.
Nov 25 06:24:35 compute-0 systemd-machined[152921]: Machine qemu-4-instance-00000004 terminated.
Nov 25 06:24:35 compute-0 neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4[213852]: [NOTICE]   (213856) : haproxy version is 2.8.14-c23fe91
Nov 25 06:24:35 compute-0 neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4[213852]: [NOTICE]   (213856) : path to executable is /usr/sbin/haproxy
Nov 25 06:24:35 compute-0 neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4[213852]: [WARNING]  (213856) : Exiting Master process...
Nov 25 06:24:35 compute-0 podman[214371]: 2025-11-25 06:24:35.695210267 +0000 UTC m=+0.023412071 container kill 623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, io.buildah.version=1.41.3)
Nov 25 06:24:35 compute-0 neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4[213852]: [ALERT]    (213856) : Current worker (213858) exited with code 143 (Terminated)
Nov 25 06:24:35 compute-0 neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4[213852]: [WARNING]  (213856) : All workers exited. Exiting... (0)
Nov 25 06:24:35 compute-0 systemd[1]: libpod-623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8.scope: Deactivated successfully.
Nov 25 06:24:35 compute-0 podman[214384]: 2025-11-25 06:24:35.725639752 +0000 UTC m=+0.016438386 container died 623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:24:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8-userdata-shm.mount: Deactivated successfully.
Nov 25 06:24:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-16c10de93db04b69bc31468fbdf4d084f6cea2f537dfc210e64f6c334d764c62-merged.mount: Deactivated successfully.
Nov 25 06:24:35 compute-0 podman[214384]: 2025-11-25 06:24:35.746714149 +0000 UTC m=+0.037512764 container cleanup 623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2)
Nov 25 06:24:35 compute-0 systemd[1]: libpod-conmon-623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8.scope: Deactivated successfully.
Nov 25 06:24:35 compute-0 podman[214385]: 2025-11-25 06:24:35.754876841 +0000 UTC m=+0.042219028 container remove 623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.758 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[b1edfc9d-5ef7-4fc3-b5dc-99ddb7005da8]: (4, ("Tue Nov 25 06:24:35 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4 (623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8)\n623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8\nTue Nov 25 06:24:35 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4 (623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8)\n623713e8daaf510ca3602682639dabd38e9429ae085d08a3626384681d2eace8\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.759 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[4fd1c0c5-8988-4fa7-9313-ddaeef02f49f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.760 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f088e83e-6869-485f-aff5-47d816c267b4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.760 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[713a4b95-0964-4036-aa1c-027030d549cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.761 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf088e83e-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.762 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:35 compute-0 kernel: tapf088e83e-60: left promiscuous mode
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.776 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.780 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:35 compute-0 NetworkManager[55345]: <info>  [1764051875.7822] manager: (tap17e014dc-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.782 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[bc22ba0c-36e2-445d-b920-164a55316f02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.793 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[0a45981b-3366-47ad-aaab-44b2443188d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.794 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e593f9-a3ac-4b51-ad21-14048294298c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.805 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[af93d78b-c7be-4167-952c-a85b72d95ce0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 278608, 'reachable_time': 16007, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214420, 'error': None, 'target': 'ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:35 compute-0 systemd[1]: run-netns-ovnmeta\x2df088e83e\x2d6869\x2d485f\x2daff5\x2d47d816c267b4.mount: Deactivated successfully.
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.806 104066 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f088e83e-6869-485f-aff5-47d816c267b4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 06:24:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:35.806 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[8c3a6186-ba7a-42bb-bdf4-49160705311e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.810 186245 INFO nova.virt.libvirt.driver [-] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Instance destroyed successfully.
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.811 186245 DEBUG nova.objects.instance [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid b79ea1d7-d6e1-430b-82bf-566447f159f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.842 186245 DEBUG nova.compute.manager [req-603b1699-2e4a-4890-bfdb-441e49198845 req-68c0b3e2-6964-4f65-aff5-2b901e34a48b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Received event network-vif-unplugged-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.842 186245 DEBUG oslo_concurrency.lockutils [req-603b1699-2e4a-4890-bfdb-441e49198845 req-68c0b3e2-6964-4f65-aff5-2b901e34a48b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.843 186245 DEBUG oslo_concurrency.lockutils [req-603b1699-2e4a-4890-bfdb-441e49198845 req-68c0b3e2-6964-4f65-aff5-2b901e34a48b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.843 186245 DEBUG oslo_concurrency.lockutils [req-603b1699-2e4a-4890-bfdb-441e49198845 req-68c0b3e2-6964-4f65-aff5-2b901e34a48b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.843 186245 DEBUG nova.compute.manager [req-603b1699-2e4a-4890-bfdb-441e49198845 req-68c0b3e2-6964-4f65-aff5-2b901e34a48b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] No waiting events found dispatching network-vif-unplugged-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:24:35 compute-0 nova_compute[186241]: 2025-11-25 06:24:35.843 186245 DEBUG nova.compute.manager [req-603b1699-2e4a-4890-bfdb-441e49198845 req-68c0b3e2-6964-4f65-aff5-2b901e34a48b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Received event network-vif-unplugged-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.313 186245 DEBUG nova.virt.libvirt.vif [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:23:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-112305377',display_name='tempest-TestNetworkBasicOps-server-112305377',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-112305377',id=4,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOcCCM5XX449ERaNbK92qLPvVLH1Xsp1m2F1vTT92DeaMDd7WOtWX4CV3c9DgYE1GaAP6//Jn1dzZvGo29HLczF+oNP7IRiMbkWTtn2RSSpZ1JyMvXiH3LfhFpiCqACiqw==',key_name='tempest-TestNetworkBasicOps-2120882882',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:23:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-zao59wro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:23:18Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=b79ea1d7-d6e1-430b-82bf-566447f159f3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "address": "fa:16:3e:44:4d:aa", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17e014dc-1f", "ovs_interfaceid": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.314 186245 DEBUG nova.network.os_vif_util [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "address": "fa:16:3e:44:4d:aa", "network": {"id": "f088e83e-6869-485f-aff5-47d816c267b4", "bridge": "br-int", "label": "tempest-network-smoke--1881386856", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17e014dc-1f", "ovs_interfaceid": "17e014dc-1fe1-4091-95b3-3c08eb9abbb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.314 186245 DEBUG nova.network.os_vif_util [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:4d:aa,bridge_name='br-int',has_traffic_filtering=True,id=17e014dc-1fe1-4091-95b3-3c08eb9abbb2,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17e014dc-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.315 186245 DEBUG os_vif [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:4d:aa,bridge_name='br-int',has_traffic_filtering=True,id=17e014dc-1fe1-4091-95b3-3c08eb9abbb2,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17e014dc-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.316 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.316 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17e014dc-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.317 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.318 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.319 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.319 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=085dad69-ccf0-424b-bc4e-53f583c9421d) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.320 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.320 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.322 186245 INFO os_vif [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:4d:aa,bridge_name='br-int',has_traffic_filtering=True,id=17e014dc-1fe1-4091-95b3-3c08eb9abbb2,network=Network(f088e83e-6869-485f-aff5-47d816c267b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17e014dc-1f')
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.322 186245 INFO nova.virt.libvirt.driver [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Deleting instance files /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3_del
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.323 186245 INFO nova.virt.libvirt.driver [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Deletion of /var/lib/nova/instances/b79ea1d7-d6e1-430b-82bf-566447f159f3_del complete
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.830 186245 INFO nova.compute.manager [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Took 1.26 seconds to destroy the instance on the hypervisor.
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.830 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.831 186245 DEBUG nova.compute.manager [-] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.831 186245 DEBUG nova.network.neutron [-] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:24:36 compute-0 nova_compute[186241]: 2025-11-25 06:24:36.961 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:37 compute-0 nova_compute[186241]: 2025-11-25 06:24:37.734 186245 DEBUG nova.compute.manager [req-1d389aac-5621-4e17-affe-f5da6a308069 req-6caf4be3-7fa7-43f0-8d7c-86371492fc60 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Received event network-vif-deleted-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:24:37 compute-0 nova_compute[186241]: 2025-11-25 06:24:37.734 186245 INFO nova.compute.manager [req-1d389aac-5621-4e17-affe-f5da6a308069 req-6caf4be3-7fa7-43f0-8d7c-86371492fc60 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Neutron deleted interface 17e014dc-1fe1-4091-95b3-3c08eb9abbb2; detaching it from the instance and deleting it from the info cache
Nov 25 06:24:37 compute-0 nova_compute[186241]: 2025-11-25 06:24:37.734 186245 DEBUG nova.network.neutron [req-1d389aac-5621-4e17-affe-f5da6a308069 req-6caf4be3-7fa7-43f0-8d7c-86371492fc60 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:24:38 compute-0 nova_compute[186241]: 2025-11-25 06:24:38.007 186245 DEBUG nova.compute.manager [req-dbe15593-593a-4e89-bd6f-bbdd26e77835 req-3388cb78-6f22-400d-a70d-dc3fd30fd3c8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Received event network-vif-plugged-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:24:38 compute-0 nova_compute[186241]: 2025-11-25 06:24:38.007 186245 DEBUG oslo_concurrency.lockutils [req-dbe15593-593a-4e89-bd6f-bbdd26e77835 req-3388cb78-6f22-400d-a70d-dc3fd30fd3c8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:38 compute-0 nova_compute[186241]: 2025-11-25 06:24:38.007 186245 DEBUG oslo_concurrency.lockutils [req-dbe15593-593a-4e89-bd6f-bbdd26e77835 req-3388cb78-6f22-400d-a70d-dc3fd30fd3c8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:38 compute-0 nova_compute[186241]: 2025-11-25 06:24:38.008 186245 DEBUG oslo_concurrency.lockutils [req-dbe15593-593a-4e89-bd6f-bbdd26e77835 req-3388cb78-6f22-400d-a70d-dc3fd30fd3c8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:38 compute-0 nova_compute[186241]: 2025-11-25 06:24:38.008 186245 DEBUG nova.compute.manager [req-dbe15593-593a-4e89-bd6f-bbdd26e77835 req-3388cb78-6f22-400d-a70d-dc3fd30fd3c8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] No waiting events found dispatching network-vif-plugged-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:24:38 compute-0 nova_compute[186241]: 2025-11-25 06:24:38.008 186245 WARNING nova.compute.manager [req-dbe15593-593a-4e89-bd6f-bbdd26e77835 req-3388cb78-6f22-400d-a70d-dc3fd30fd3c8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Received unexpected event network-vif-plugged-17e014dc-1fe1-4091-95b3-3c08eb9abbb2 for instance with vm_state active and task_state deleting.
Nov 25 06:24:38 compute-0 podman[214426]: 2025-11-25 06:24:38.060943474 +0000 UTC m=+0.040458903 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Nov 25 06:24:38 compute-0 nova_compute[186241]: 2025-11-25 06:24:38.104 186245 DEBUG nova.network.neutron [-] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:24:38 compute-0 nova_compute[186241]: 2025-11-25 06:24:38.239 186245 DEBUG nova.compute.manager [req-1d389aac-5621-4e17-affe-f5da6a308069 req-6caf4be3-7fa7-43f0-8d7c-86371492fc60 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Detach interface failed, port_id=17e014dc-1fe1-4091-95b3-3c08eb9abbb2, reason: Instance b79ea1d7-d6e1-430b-82bf-566447f159f3 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Nov 25 06:24:38 compute-0 nova_compute[186241]: 2025-11-25 06:24:38.607 186245 INFO nova.compute.manager [-] [instance: b79ea1d7-d6e1-430b-82bf-566447f159f3] Took 1.78 seconds to deallocate network for instance.
Nov 25 06:24:39 compute-0 nova_compute[186241]: 2025-11-25 06:24:39.112 186245 DEBUG oslo_concurrency.lockutils [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:39 compute-0 nova_compute[186241]: 2025-11-25 06:24:39.113 186245 DEBUG oslo_concurrency.lockutils [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:39 compute-0 nova_compute[186241]: 2025-11-25 06:24:39.147 186245 DEBUG nova.compute.provider_tree [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:24:39 compute-0 nova_compute[186241]: 2025-11-25 06:24:39.650 186245 DEBUG nova.scheduler.client.report [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:24:40 compute-0 nova_compute[186241]: 2025-11-25 06:24:40.156 186245 DEBUG oslo_concurrency.lockutils [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:40 compute-0 nova_compute[186241]: 2025-11-25 06:24:40.173 186245 INFO nova.scheduler.client.report [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance b79ea1d7-d6e1-430b-82bf-566447f159f3
Nov 25 06:24:41 compute-0 nova_compute[186241]: 2025-11-25 06:24:41.182 186245 DEBUG oslo_concurrency.lockutils [None req-c5032ea8-3f59-41c9-b893-b2325375601d 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "b79ea1d7-d6e1-430b-82bf-566447f159f3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:41 compute-0 nova_compute[186241]: 2025-11-25 06:24:41.321 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:41 compute-0 nova_compute[186241]: 2025-11-25 06:24:41.963 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:43.297 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:24:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:43.297 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:24:43 compute-0 nova_compute[186241]: 2025-11-25 06:24:43.299 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:44 compute-0 podman[214445]: 2025-11-25 06:24:44.059935106 +0000 UTC m=+0.038505203 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:24:46 compute-0 nova_compute[186241]: 2025-11-25 06:24:46.323 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:46 compute-0 nova_compute[186241]: 2025-11-25 06:24:46.567 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:46 compute-0 nova_compute[186241]: 2025-11-25 06:24:46.644 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:46 compute-0 nova_compute[186241]: 2025-11-25 06:24:46.966 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:47.465 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:24:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:47.465 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:24:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:47.465 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:24:49 compute-0 podman[214464]: 2025-11-25 06:24:49.057166426 +0000 UTC m=+0.037188224 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:24:49 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:49.298 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:24:51 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:51.272 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:4d:2e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bdd0af2e-c79c-421a-a113-be4d7ab826e9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d69e634-57f2-49e0-8c89-35d178b67c36, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=adae0e10-6930-4117-9b4f-dd5ad7e75d7a) old=Port_Binding(mac=['fa:16:3e:2b:4d:2e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bdd0af2e-c79c-421a-a113-be4d7ab826e9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:24:51 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:51.274 103953 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port adae0e10-6930-4117-9b4f-dd5ad7e75d7a in datapath bdd0af2e-c79c-421a-a113-be4d7ab826e9 updated
Nov 25 06:24:51 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:51.275 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bdd0af2e-c79c-421a-a113-be4d7ab826e9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:24:51 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:24:51.276 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[4aabdeec-dc82-491f-9f43-dfe05bd72bfa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:24:51 compute-0 nova_compute[186241]: 2025-11-25 06:24:51.325 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:51 compute-0 nova_compute[186241]: 2025-11-25 06:24:51.967 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:56 compute-0 nova_compute[186241]: 2025-11-25 06:24:56.327 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:24:56 compute-0 podman[214485]: 2025-11-25 06:24:56.397345733 +0000 UTC m=+0.053524377 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:24:56 compute-0 nova_compute[186241]: 2025-11-25 06:24:56.969 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:01 compute-0 podman[214509]: 2025-11-25 06:25:01.060938583 +0000 UTC m=+0.038382996 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 06:25:01 compute-0 podman[214508]: 2025-11-25 06:25:01.067961893 +0000 UTC m=+0.046364218 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:25:01 compute-0 nova_compute[186241]: 2025-11-25 06:25:01.329 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:01 compute-0 nova_compute[186241]: 2025-11-25 06:25:01.969 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:03 compute-0 nova_compute[186241]: 2025-11-25 06:25:03.132 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:03 compute-0 nova_compute[186241]: 2025-11-25 06:25:03.132 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:03 compute-0 nova_compute[186241]: 2025-11-25 06:25:03.635 186245 DEBUG nova.compute.manager [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:25:04 compute-0 podman[214546]: 2025-11-25 06:25:04.057017644 +0000 UTC m=+0.037224198 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 25 06:25:04 compute-0 nova_compute[186241]: 2025-11-25 06:25:04.162 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:04 compute-0 nova_compute[186241]: 2025-11-25 06:25:04.162 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:04 compute-0 nova_compute[186241]: 2025-11-25 06:25:04.168 186245 DEBUG nova.virt.hardware [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:25:04 compute-0 nova_compute[186241]: 2025-11-25 06:25:04.168 186245 INFO nova.compute.claims [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:25:05 compute-0 nova_compute[186241]: 2025-11-25 06:25:05.204 186245 DEBUG nova.compute.provider_tree [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:25:05 compute-0 nova_compute[186241]: 2025-11-25 06:25:05.708 186245 DEBUG nova.scheduler.client.report [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:25:06 compute-0 nova_compute[186241]: 2025-11-25 06:25:06.213 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:25:06 compute-0 nova_compute[186241]: 2025-11-25 06:25:06.214 186245 DEBUG nova.compute.manager [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:25:06 compute-0 nova_compute[186241]: 2025-11-25 06:25:06.330 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:06 compute-0 nova_compute[186241]: 2025-11-25 06:25:06.719 186245 DEBUG nova.compute.manager [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:25:06 compute-0 nova_compute[186241]: 2025-11-25 06:25:06.719 186245 DEBUG nova.network.neutron [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:25:06 compute-0 nova_compute[186241]: 2025-11-25 06:25:06.971 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:07 compute-0 nova_compute[186241]: 2025-11-25 06:25:07.221 186245 DEBUG nova.policy [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:25:07 compute-0 nova_compute[186241]: 2025-11-25 06:25:07.223 186245 INFO nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:25:07 compute-0 nova_compute[186241]: 2025-11-25 06:25:07.726 186245 DEBUG nova.compute.manager [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.269 186245 DEBUG nova.network.neutron [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Successfully created port: 83e4beda-0cfb-4824-8d25-0345811c9a67 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.736 186245 DEBUG nova.compute.manager [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.737 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.738 186245 INFO nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Creating image(s)
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.738 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.738 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.739 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.739 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.742 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.743 186245 DEBUG oslo_concurrency.processutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.787 186245 DEBUG oslo_concurrency.processutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.788 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.788 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.789 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.792 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.792 186245 DEBUG oslo_concurrency.processutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.835 186245 DEBUG oslo_concurrency.processutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.835 186245 DEBUG oslo_concurrency.processutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.854 186245 DEBUG oslo_concurrency.processutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.855 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.856 186245 DEBUG oslo_concurrency.processutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.892 186245 DEBUG nova.network.neutron [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Successfully updated port: 83e4beda-0cfb-4824-8d25-0345811c9a67 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.898 186245 DEBUG oslo_concurrency.processutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.899 186245 DEBUG nova.virt.disk.api [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.899 186245 DEBUG oslo_concurrency.processutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.942 186245 DEBUG oslo_concurrency.processutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.943 186245 DEBUG nova.virt.disk.api [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.944 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.944 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Ensure instance console log exists: /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.944 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.945 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:08 compute-0 nova_compute[186241]: 2025-11-25 06:25:08.945 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:25:09 compute-0 nova_compute[186241]: 2025-11-25 06:25:09.038 186245 DEBUG nova.compute.manager [req-4a6deac7-d95d-4b74-9023-ebc8f95f383e req-8c5980f4-0579-4624-9153-67fea0dccd3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-changed-83e4beda-0cfb-4824-8d25-0345811c9a67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:25:09 compute-0 nova_compute[186241]: 2025-11-25 06:25:09.038 186245 DEBUG nova.compute.manager [req-4a6deac7-d95d-4b74-9023-ebc8f95f383e req-8c5980f4-0579-4624-9153-67fea0dccd3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Refreshing instance network info cache due to event network-changed-83e4beda-0cfb-4824-8d25-0345811c9a67. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:25:09 compute-0 nova_compute[186241]: 2025-11-25 06:25:09.039 186245 DEBUG oslo_concurrency.lockutils [req-4a6deac7-d95d-4b74-9023-ebc8f95f383e req-8c5980f4-0579-4624-9153-67fea0dccd3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:25:09 compute-0 nova_compute[186241]: 2025-11-25 06:25:09.039 186245 DEBUG oslo_concurrency.lockutils [req-4a6deac7-d95d-4b74-9023-ebc8f95f383e req-8c5980f4-0579-4624-9153-67fea0dccd3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:25:09 compute-0 nova_compute[186241]: 2025-11-25 06:25:09.039 186245 DEBUG nova.network.neutron [req-4a6deac7-d95d-4b74-9023-ebc8f95f383e req-8c5980f4-0579-4624-9153-67fea0dccd3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Refreshing network info cache for port 83e4beda-0cfb-4824-8d25-0345811c9a67 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:25:09 compute-0 podman[214577]: 2025-11-25 06:25:09.057127828 +0000 UTC m=+0.037284080 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 25 06:25:09 compute-0 nova_compute[186241]: 2025-11-25 06:25:09.396 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:25:10 compute-0 nova_compute[186241]: 2025-11-25 06:25:10.123 186245 DEBUG nova.network.neutron [req-4a6deac7-d95d-4b74-9023-ebc8f95f383e req-8c5980f4-0579-4624-9153-67fea0dccd3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:25:11 compute-0 nova_compute[186241]: 2025-11-25 06:25:11.332 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:11 compute-0 nova_compute[186241]: 2025-11-25 06:25:11.973 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:12 compute-0 nova_compute[186241]: 2025-11-25 06:25:12.203 186245 DEBUG nova.network.neutron [req-4a6deac7-d95d-4b74-9023-ebc8f95f383e req-8c5980f4-0579-4624-9153-67fea0dccd3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:25:12 compute-0 nova_compute[186241]: 2025-11-25 06:25:12.706 186245 DEBUG oslo_concurrency.lockutils [req-4a6deac7-d95d-4b74-9023-ebc8f95f383e req-8c5980f4-0579-4624-9153-67fea0dccd3c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:25:12 compute-0 nova_compute[186241]: 2025-11-25 06:25:12.707 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:25:12 compute-0 nova_compute[186241]: 2025-11-25 06:25:12.707 186245 DEBUG nova.network.neutron [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:25:13 compute-0 nova_compute[186241]: 2025-11-25 06:25:13.501 186245 DEBUG nova.network.neutron [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:25:15 compute-0 podman[214595]: 2025-11-25 06:25:15.061893475 +0000 UTC m=+0.041607161 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.371 186245 DEBUG nova.network.neutron [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updating instance_info_cache with network_info: [{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.874 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.874 186245 DEBUG nova.compute.manager [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Instance network_info: |[{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.876 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Start _get_guest_xml network_info=[{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.878 186245 WARNING nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.879 186245 DEBUG nova.virt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-2139323515', uuid='90a703a7-09d1-4f58-84e5-80f4083b5922'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764051915.8794138) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.884 186245 DEBUG nova.virt.libvirt.host [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.885 186245 DEBUG nova.virt.libvirt.host [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.887 186245 DEBUG nova.virt.libvirt.host [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.887 186245 DEBUG nova.virt.libvirt.host [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.888 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.888 186245 DEBUG nova.virt.hardware [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.888 186245 DEBUG nova.virt.hardware [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.888 186245 DEBUG nova.virt.hardware [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.889 186245 DEBUG nova.virt.hardware [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.889 186245 DEBUG nova.virt.hardware [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.889 186245 DEBUG nova.virt.hardware [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.889 186245 DEBUG nova.virt.hardware [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.889 186245 DEBUG nova.virt.hardware [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.889 186245 DEBUG nova.virt.hardware [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.890 186245 DEBUG nova.virt.hardware [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.890 186245 DEBUG nova.virt.hardware [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.892 186245 DEBUG nova.virt.libvirt.vif [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:25:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2139323515',display_name='tempest-TestNetworkBasicOps-server-2139323515',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2139323515',id=6,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPfG/Hr03+7kUyqdyJ4VcrC6OgJZvQPY0869e/9DA7kenSXh4EDJbfr323zFsTAZ1JBig6V1BBInXPavwPrKol6GncaRLGsPY2WM3LUFf75N9E/ms8i8IlOrkZUHQpzmFA==',key_name='tempest-TestNetworkBasicOps-941751953',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-4i1j2g0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:25:07Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=90a703a7-09d1-4f58-84e5-80f4083b5922,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.892 186245 DEBUG nova.network.os_vif_util [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.893 186245 DEBUG nova.network.os_vif_util [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:a7:44,bridge_name='br-int',has_traffic_filtering=True,id=83e4beda-0cfb-4824-8d25-0345811c9a67,network=Network(bdd0af2e-c79c-421a-a113-be4d7ab826e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83e4beda-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.894 186245 DEBUG nova.objects.instance [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid 90a703a7-09d1-4f58-84e5-80f4083b5922 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:15 compute-0 nova_compute[186241]: 2025-11-25 06:25:15.931 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11872
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.334 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.397 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:25:16 compute-0 nova_compute[186241]:   <uuid>90a703a7-09d1-4f58-84e5-80f4083b5922</uuid>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   <name>instance-00000006</name>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-2139323515</nova:name>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:25:15</nova:creationTime>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:25:16 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:25:16 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:25:16 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:25:16 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:25:16 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:25:16 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:25:16 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:25:16 compute-0 nova_compute[186241]:         <nova:port uuid="83e4beda-0cfb-4824-8d25-0345811c9a67">
Nov 25 06:25:16 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <system>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <entry name="serial">90a703a7-09d1-4f58-84e5-80f4083b5922</entry>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <entry name="uuid">90a703a7-09d1-4f58-84e5-80f4083b5922</entry>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     </system>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   <os>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   </os>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   <features>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   </features>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk.config"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:a2:a7:44"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <target dev="tap83e4beda-0c"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/console.log" append="off"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <video>
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     </video>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:25:16 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:25:16 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:25:16 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:25:16 compute-0 nova_compute[186241]: </domain>
Nov 25 06:25:16 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.398 186245 DEBUG nova.compute.manager [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Preparing to wait for external event network-vif-plugged-83e4beda-0cfb-4824-8d25-0345811c9a67 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.398 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.398 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.398 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.399 186245 DEBUG nova.virt.libvirt.vif [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:25:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2139323515',display_name='tempest-TestNetworkBasicOps-server-2139323515',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2139323515',id=6,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPfG/Hr03+7kUyqdyJ4VcrC6OgJZvQPY0869e/9DA7kenSXh4EDJbfr323zFsTAZ1JBig6V1BBInXPavwPrKol6GncaRLGsPY2WM3LUFf75N9E/ms8i8IlOrkZUHQpzmFA==',key_name='tempest-TestNetworkBasicOps-941751953',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-4i1j2g0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:25:07Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=90a703a7-09d1-4f58-84e5-80f4083b5922,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.399 186245 DEBUG nova.network.os_vif_util [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.400 186245 DEBUG nova.network.os_vif_util [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:a7:44,bridge_name='br-int',has_traffic_filtering=True,id=83e4beda-0cfb-4824-8d25-0345811c9a67,network=Network(bdd0af2e-c79c-421a-a113-be4d7ab826e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83e4beda-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.400 186245 DEBUG os_vif [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:a7:44,bridge_name='br-int',has_traffic_filtering=True,id=83e4beda-0cfb-4824-8d25-0345811c9a67,network=Network(bdd0af2e-c79c-421a-a113-be4d7ab826e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83e4beda-0c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.400 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.401 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.401 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.401 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.402 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9a1b86a3-3852-50b8-9ec2-09859d171e57', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.402 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.403 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.405 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.405 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap83e4beda-0c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.405 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap83e4beda-0c, col_values=(('qos', UUID('526a557f-6104-4183-92ec-dc176bfd84ad')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.406 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap83e4beda-0c, col_values=(('external_ids', {'iface-id': '83e4beda-0cfb-4824-8d25-0345811c9a67', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:a7:44', 'vm-uuid': '90a703a7-09d1-4f58-84e5-80f4083b5922'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:25:16 compute-0 NetworkManager[55345]: <info>  [1764051916.4075] manager: (tap83e4beda-0c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.406 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.410 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.411 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.411 186245 INFO os_vif [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:a7:44,bridge_name='br-int',has_traffic_filtering=True,id=83e4beda-0cfb-4824-8d25-0345811c9a67,network=Network(bdd0af2e-c79c-421a-a113-be4d7ab826e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83e4beda-0c')
Nov 25 06:25:16 compute-0 nova_compute[186241]: 2025-11-25 06:25:16.975 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:17 compute-0 nova_compute[186241]: 2025-11-25 06:25:17.935 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:25:17 compute-0 nova_compute[186241]: 2025-11-25 06:25:17.935 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:25:17 compute-0 nova_compute[186241]: 2025-11-25 06:25:17.936 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:a2:a7:44, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:25:17 compute-0 nova_compute[186241]: 2025-11-25 06:25:17.936 186245 INFO nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Using config drive
Nov 25 06:25:19 compute-0 nova_compute[186241]: 2025-11-25 06:25:19.632 186245 INFO nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Creating config drive at /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk.config
Nov 25 06:25:19 compute-0 nova_compute[186241]: 2025-11-25 06:25:19.637 186245 DEBUG oslo_concurrency.processutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpvilise1p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:25:19 compute-0 nova_compute[186241]: 2025-11-25 06:25:19.753 186245 DEBUG oslo_concurrency.processutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpvilise1p" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:25:19 compute-0 kernel: tap83e4beda-0c: entered promiscuous mode
Nov 25 06:25:19 compute-0 NetworkManager[55345]: <info>  [1764051919.7972] manager: (tap83e4beda-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Nov 25 06:25:19 compute-0 ovn_controller[95135]: 2025-11-25T06:25:19Z|00096|binding|INFO|Claiming lport 83e4beda-0cfb-4824-8d25-0345811c9a67 for this chassis.
Nov 25 06:25:19 compute-0 ovn_controller[95135]: 2025-11-25T06:25:19Z|00097|binding|INFO|83e4beda-0cfb-4824-8d25-0345811c9a67: Claiming fa:16:3e:a2:a7:44 10.100.0.5
Nov 25 06:25:19 compute-0 nova_compute[186241]: 2025-11-25 06:25:19.806 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:19 compute-0 nova_compute[186241]: 2025-11-25 06:25:19.808 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.814 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:a7:44 10.100.0.5'], port_security=['fa:16:3e:a2:a7:44 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '90a703a7-09d1-4f58-84e5-80f4083b5922', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bdd0af2e-c79c-421a-a113-be4d7ab826e9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aee7b322-406a-47f1-954e-0d371991f172', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d69e634-57f2-49e0-8c89-35d178b67c36, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=83e4beda-0cfb-4824-8d25-0345811c9a67) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.815 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 83e4beda-0cfb-4824-8d25-0345811c9a67 in datapath bdd0af2e-c79c-421a-a113-be4d7ab826e9 bound to our chassis
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.816 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bdd0af2e-c79c-421a-a113-be4d7ab826e9
Nov 25 06:25:19 compute-0 systemd-udevd[214640]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:25:19 compute-0 systemd-machined[152921]: New machine qemu-6-instance-00000006.
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.835 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[55fd0109-1008-4464-ae0e-e6db5035cd25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.836 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbdd0af2e-c1 in ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.837 211354 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbdd0af2e-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.838 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[4f458ee4-eade-42cb-949e-c8866a889e8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.838 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[418a0ee4-abfb-4543-8e6d-7ffa6d8fb4a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:19 compute-0 NetworkManager[55345]: <info>  [1764051919.8464] device (tap83e4beda-0c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:25:19 compute-0 NetworkManager[55345]: <info>  [1764051919.8470] device (tap83e4beda-0c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.845 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[5343aacf-68db-4129-8c2a-1a9a5272d909]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:19 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Nov 25 06:25:19 compute-0 podman[214624]: 2025-11-25 06:25:19.868170618 +0000 UTC m=+0.068041202 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.875 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1fa0f3-4aa0-4fda-98f9-bc6e44cbcccb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:19 compute-0 ovn_controller[95135]: 2025-11-25T06:25:19Z|00098|binding|INFO|Setting lport 83e4beda-0cfb-4824-8d25-0345811c9a67 ovn-installed in OVS
Nov 25 06:25:19 compute-0 ovn_controller[95135]: 2025-11-25T06:25:19Z|00099|binding|INFO|Setting lport 83e4beda-0cfb-4824-8d25-0345811c9a67 up in Southbound
Nov 25 06:25:19 compute-0 nova_compute[186241]: 2025-11-25 06:25:19.879 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.895 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[13efa2dc-091c-4f43-9465-2d023fafc4fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:19 compute-0 systemd-udevd[214649]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:25:19 compute-0 NetworkManager[55345]: <info>  [1764051919.8999] manager: (tapbdd0af2e-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.900 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[d01145b5-994e-4c64-866b-8ad966ed0897]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.922 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[8e8b82db-f33d-4f64-bbee-80c028de23ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.924 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4af401-3f98-4552-b2ca-543b37cefc9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:19 compute-0 NetworkManager[55345]: <info>  [1764051919.9395] device (tapbdd0af2e-c0): carrier: link connected
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.942 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[e21b206c-1131-40dc-84cc-ac3794bd0118]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.955 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[c138a4e9-284b-4d17-8045-bb9a0f801361]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbdd0af2e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4d:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 290834, 'reachable_time': 39041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214677, 'error': None, 'target': 'ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.966 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ea92484c-f824-4ba5-b645-3d9b658664a9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2b:4d2e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 290834, 'tstamp': 290834}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214678, 'error': None, 'target': 'ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:19.979 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[e7425554-3657-4918-93a8-ed8ddd58b25e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbdd0af2e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4d:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 290834, 'reachable_time': 39041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214679, 'error': None, 'target': 'ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.001 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[6b3aa55d-1b86-404d-bb03-def78e821760]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.038 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d5d3d9-184a-4627-bba8-7df8ac4b65f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.039 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbdd0af2e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.039 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.039 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbdd0af2e-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:25:20 compute-0 NetworkManager[55345]: <info>  [1764051920.0417] manager: (tapbdd0af2e-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Nov 25 06:25:20 compute-0 kernel: tapbdd0af2e-c0: entered promiscuous mode
Nov 25 06:25:20 compute-0 nova_compute[186241]: 2025-11-25 06:25:20.041 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.046 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbdd0af2e-c0, col_values=(('external_ids', {'iface-id': 'adae0e10-6930-4117-9b4f-dd5ad7e75d7a'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:25:20 compute-0 ovn_controller[95135]: 2025-11-25T06:25:20Z|00100|binding|INFO|Releasing lport adae0e10-6930-4117-9b4f-dd5ad7e75d7a from this chassis (sb_readonly=0)
Nov 25 06:25:20 compute-0 nova_compute[186241]: 2025-11-25 06:25:20.047 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.049 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[99693829-be52-45ba-aae9-62cdce36431d]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.050 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bdd0af2e-c79c-421a-a113-be4d7ab826e9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bdd0af2e-c79c-421a-a113-be4d7ab826e9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.050 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bdd0af2e-c79c-421a-a113-be4d7ab826e9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bdd0af2e-c79c-421a-a113-be4d7ab826e9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.050 103953 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for bdd0af2e-c79c-421a-a113-be4d7ab826e9 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.050 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bdd0af2e-c79c-421a-a113-be4d7ab826e9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bdd0af2e-c79c-421a-a113-be4d7ab826e9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.051 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7256b84e-1064-4a68-ba77-7684449e1065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.051 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bdd0af2e-c79c-421a-a113-be4d7ab826e9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bdd0af2e-c79c-421a-a113-be4d7ab826e9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.051 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[c3cc4cb6-3959-4f63-b9da-1f681ede4c89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.052 103953 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: global
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     log         /dev/log local0 debug
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     log-tag     haproxy-metadata-proxy-bdd0af2e-c79c-421a-a113-be4d7ab826e9
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     user        root
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     group       root
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     maxconn     1024
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     pidfile     /var/lib/neutron/external/pids/bdd0af2e-c79c-421a-a113-be4d7ab826e9.pid.haproxy
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     daemon
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: defaults
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     log global
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     mode http
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     option httplog
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     option dontlognull
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     option http-server-close
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     option forwardfor
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     retries                 3
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     timeout http-request    30s
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     timeout connect         30s
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     timeout client          32s
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     timeout server          32s
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     timeout http-keep-alive 30s
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: listen listener
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     bind 169.254.169.254:80
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:     http-request add-header X-OVN-Network-ID bdd0af2e-c79c-421a-a113-be4d7ab826e9
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 06:25:20 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:20.053 103953 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9', 'env', 'PROCESS_TAG=haproxy-bdd0af2e-c79c-421a-a113-be4d7ab826e9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bdd0af2e-c79c-421a-a113-be4d7ab826e9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Nov 25 06:25:20 compute-0 nova_compute[186241]: 2025-11-25 06:25:20.059 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:20 compute-0 nova_compute[186241]: 2025-11-25 06:25:20.090 186245 DEBUG nova.compute.manager [req-f63c5322-16e0-4c4e-87fb-53c794934f87 req-b0f1be8a-8d8d-427f-96f2-733edf261b6e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-vif-plugged-83e4beda-0cfb-4824-8d25-0345811c9a67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:25:20 compute-0 nova_compute[186241]: 2025-11-25 06:25:20.090 186245 DEBUG oslo_concurrency.lockutils [req-f63c5322-16e0-4c4e-87fb-53c794934f87 req-b0f1be8a-8d8d-427f-96f2-733edf261b6e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:20 compute-0 nova_compute[186241]: 2025-11-25 06:25:20.090 186245 DEBUG oslo_concurrency.lockutils [req-f63c5322-16e0-4c4e-87fb-53c794934f87 req-b0f1be8a-8d8d-427f-96f2-733edf261b6e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:20 compute-0 nova_compute[186241]: 2025-11-25 06:25:20.091 186245 DEBUG oslo_concurrency.lockutils [req-f63c5322-16e0-4c4e-87fb-53c794934f87 req-b0f1be8a-8d8d-427f-96f2-733edf261b6e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:25:20 compute-0 nova_compute[186241]: 2025-11-25 06:25:20.091 186245 DEBUG nova.compute.manager [req-f63c5322-16e0-4c4e-87fb-53c794934f87 req-b0f1be8a-8d8d-427f-96f2-733edf261b6e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Processing event network-vif-plugged-83e4beda-0cfb-4824-8d25-0345811c9a67 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:25:20 compute-0 podman[214707]: 2025-11-25 06:25:20.321906643 +0000 UTC m=+0.028225991 container create bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 25 06:25:20 compute-0 systemd[1]: Started libpod-conmon-bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49.scope.
Nov 25 06:25:20 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:25:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07ab3e65389ddb6ffb244ead935183eee96a6bcc9748b35cbe3f2448b6efe311/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:25:20 compute-0 podman[214707]: 2025-11-25 06:25:20.374797548 +0000 UTC m=+0.081116917 container init bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 25 06:25:20 compute-0 podman[214707]: 2025-11-25 06:25:20.37954602 +0000 UTC m=+0.085865369 container start bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:25:20 compute-0 podman[214707]: 2025-11-25 06:25:20.309130301 +0000 UTC m=+0.015449670 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:25:20 compute-0 neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9[214719]: [NOTICE]   (214723) : New worker (214725) forked
Nov 25 06:25:20 compute-0 neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9[214719]: [NOTICE]   (214723) : Loading success.
Nov 25 06:25:20 compute-0 nova_compute[186241]: 2025-11-25 06:25:20.698 186245 DEBUG nova.compute.manager [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:25:20 compute-0 nova_compute[186241]: 2025-11-25 06:25:20.701 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:25:20 compute-0 nova_compute[186241]: 2025-11-25 06:25:20.703 186245 INFO nova.virt.libvirt.driver [-] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Instance spawned successfully.
Nov 25 06:25:20 compute-0 nova_compute[186241]: 2025-11-25 06:25:20.703 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.210 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.210 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.211 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.211 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.211 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.211 186245 DEBUG nova.virt.libvirt.driver [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.408 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.433 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.434 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.717 186245 INFO nova.compute.manager [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Took 12.98 seconds to spawn the instance on the hypervisor.
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.718 186245 DEBUG nova.compute.manager [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:21 compute-0 nova_compute[186241]: 2025-11-25 06:25:21.977 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:22 compute-0 nova_compute[186241]: 2025-11-25 06:25:22.230 186245 INFO nova.compute.manager [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Took 18.09 seconds to build instance.
Nov 25 06:25:22 compute-0 nova_compute[186241]: 2025-11-25 06:25:22.240 186245 DEBUG nova.compute.manager [req-c1868945-8c8a-4964-860d-c6764024f22e req-e6ce1e05-f861-4a84-80a0-ef6137003203 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-vif-plugged-83e4beda-0cfb-4824-8d25-0345811c9a67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:25:22 compute-0 nova_compute[186241]: 2025-11-25 06:25:22.240 186245 DEBUG oslo_concurrency.lockutils [req-c1868945-8c8a-4964-860d-c6764024f22e req-e6ce1e05-f861-4a84-80a0-ef6137003203 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:22 compute-0 nova_compute[186241]: 2025-11-25 06:25:22.240 186245 DEBUG oslo_concurrency.lockutils [req-c1868945-8c8a-4964-860d-c6764024f22e req-e6ce1e05-f861-4a84-80a0-ef6137003203 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:22 compute-0 nova_compute[186241]: 2025-11-25 06:25:22.241 186245 DEBUG oslo_concurrency.lockutils [req-c1868945-8c8a-4964-860d-c6764024f22e req-e6ce1e05-f861-4a84-80a0-ef6137003203 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:25:22 compute-0 nova_compute[186241]: 2025-11-25 06:25:22.241 186245 DEBUG nova.compute.manager [req-c1868945-8c8a-4964-860d-c6764024f22e req-e6ce1e05-f861-4a84-80a0-ef6137003203 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] No waiting events found dispatching network-vif-plugged-83e4beda-0cfb-4824-8d25-0345811c9a67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:25:22 compute-0 nova_compute[186241]: 2025-11-25 06:25:22.241 186245 WARNING nova.compute.manager [req-c1868945-8c8a-4964-860d-c6764024f22e req-e6ce1e05-f861-4a84-80a0-ef6137003203 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received unexpected event network-vif-plugged-83e4beda-0cfb-4824-8d25-0345811c9a67 for instance with vm_state active and task_state None.
Nov 25 06:25:22 compute-0 nova_compute[186241]: 2025-11-25 06:25:22.732 186245 DEBUG oslo_concurrency.lockutils [None req-9315a23c-abf8-4ac1-bbe4-bc6bdbcfcc00 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:25:23 compute-0 nova_compute[186241]: 2025-11-25 06:25:23.434 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:23 compute-0 nova_compute[186241]: 2025-11-25 06:25:23.434 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:23 compute-0 nova_compute[186241]: 2025-11-25 06:25:23.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:23 compute-0 nova_compute[186241]: 2025-11-25 06:25:23.932 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:25:23 compute-0 nova_compute[186241]: 2025-11-25 06:25:23.933 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:24 compute-0 nova_compute[186241]: 2025-11-25 06:25:24.442 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:24 compute-0 nova_compute[186241]: 2025-11-25 06:25:24.442 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:24 compute-0 nova_compute[186241]: 2025-11-25 06:25:24.442 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:25:24 compute-0 nova_compute[186241]: 2025-11-25 06:25:24.443 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:25:25 compute-0 nova_compute[186241]: 2025-11-25 06:25:25.466 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:25:25 compute-0 nova_compute[186241]: 2025-11-25 06:25:25.524 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:25:25 compute-0 nova_compute[186241]: 2025-11-25 06:25:25.524 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:25:25 compute-0 nova_compute[186241]: 2025-11-25 06:25:25.581 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:25:25 compute-0 nova_compute[186241]: 2025-11-25 06:25:25.775 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:25:25 compute-0 nova_compute[186241]: 2025-11-25 06:25:25.776 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5587MB free_disk=73.02111434936523GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:25:25 compute-0 nova_compute[186241]: 2025-11-25 06:25:25.776 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:25 compute-0 nova_compute[186241]: 2025-11-25 06:25:25.777 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:26 compute-0 nova_compute[186241]: 2025-11-25 06:25:26.410 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:26 compute-0 nova_compute[186241]: 2025-11-25 06:25:26.813 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance 90a703a7-09d1-4f58-84e5-80f4083b5922 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:25:26 compute-0 nova_compute[186241]: 2025-11-25 06:25:26.813 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:25:26 compute-0 nova_compute[186241]: 2025-11-25 06:25:26.814 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:25:26 compute-0 nova_compute[186241]: 2025-11-25 06:25:26.854 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:25:26 compute-0 nova_compute[186241]: 2025-11-25 06:25:26.977 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:27 compute-0 podman[214744]: 2025-11-25 06:25:27.079919129 +0000 UTC m=+0.057554798 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_controller, container_name=ovn_controller)
Nov 25 06:25:27 compute-0 nova_compute[186241]: 2025-11-25 06:25:27.357 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:25:27 compute-0 nova_compute[186241]: 2025-11-25 06:25:27.712 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:27 compute-0 NetworkManager[55345]: <info>  [1764051927.7168] manager: (patch-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Nov 25 06:25:27 compute-0 ovn_controller[95135]: 2025-11-25T06:25:27Z|00101|binding|INFO|Releasing lport adae0e10-6930-4117-9b4f-dd5ad7e75d7a from this chassis (sb_readonly=0)
Nov 25 06:25:27 compute-0 NetworkManager[55345]: <info>  [1764051927.7176] manager: (patch-br-int-to-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Nov 25 06:25:27 compute-0 ovn_controller[95135]: 2025-11-25T06:25:27Z|00102|binding|INFO|Releasing lport adae0e10-6930-4117-9b4f-dd5ad7e75d7a from this chassis (sb_readonly=0)
Nov 25 06:25:27 compute-0 nova_compute[186241]: 2025-11-25 06:25:27.727 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:27 compute-0 nova_compute[186241]: 2025-11-25 06:25:27.731 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:27 compute-0 nova_compute[186241]: 2025-11-25 06:25:27.863 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:25:27 compute-0 nova_compute[186241]: 2025-11-25 06:25:27.864 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:25:28 compute-0 nova_compute[186241]: 2025-11-25 06:25:28.091 186245 DEBUG nova.compute.manager [req-42a95a43-5aee-449a-ab64-ea81faad6394 req-520fbc5b-5d3a-47d6-b3f8-3e2f1cbeb8f0 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-changed-83e4beda-0cfb-4824-8d25-0345811c9a67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:25:28 compute-0 nova_compute[186241]: 2025-11-25 06:25:28.091 186245 DEBUG nova.compute.manager [req-42a95a43-5aee-449a-ab64-ea81faad6394 req-520fbc5b-5d3a-47d6-b3f8-3e2f1cbeb8f0 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Refreshing instance network info cache due to event network-changed-83e4beda-0cfb-4824-8d25-0345811c9a67. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:25:28 compute-0 nova_compute[186241]: 2025-11-25 06:25:28.091 186245 DEBUG oslo_concurrency.lockutils [req-42a95a43-5aee-449a-ab64-ea81faad6394 req-520fbc5b-5d3a-47d6-b3f8-3e2f1cbeb8f0 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:25:28 compute-0 nova_compute[186241]: 2025-11-25 06:25:28.092 186245 DEBUG oslo_concurrency.lockutils [req-42a95a43-5aee-449a-ab64-ea81faad6394 req-520fbc5b-5d3a-47d6-b3f8-3e2f1cbeb8f0 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:25:28 compute-0 nova_compute[186241]: 2025-11-25 06:25:28.092 186245 DEBUG nova.network.neutron [req-42a95a43-5aee-449a-ab64-ea81faad6394 req-520fbc5b-5d3a-47d6-b3f8-3e2f1cbeb8f0 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Refreshing network info cache for port 83e4beda-0cfb-4824-8d25-0345811c9a67 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:25:28 compute-0 nova_compute[186241]: 2025-11-25 06:25:28.859 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:28 compute-0 nova_compute[186241]: 2025-11-25 06:25:28.859 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:29 compute-0 nova_compute[186241]: 2025-11-25 06:25:29.366 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:29 compute-0 nova_compute[186241]: 2025-11-25 06:25:29.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:29 compute-0 nova_compute[186241]: 2025-11-25 06:25:29.932 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11834
Nov 25 06:25:30 compute-0 nova_compute[186241]: 2025-11-25 06:25:30.435 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11843
Nov 25 06:25:31 compute-0 ovn_controller[95135]: 2025-11-25T06:25:31Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a2:a7:44 10.100.0.5
Nov 25 06:25:31 compute-0 ovn_controller[95135]: 2025-11-25T06:25:31Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a2:a7:44 10.100.0.5
Nov 25 06:25:31 compute-0 nova_compute[186241]: 2025-11-25 06:25:31.413 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:31 compute-0 nova_compute[186241]: 2025-11-25 06:25:31.978 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:32 compute-0 podman[214782]: 2025-11-25 06:25:32.067003461 +0000 UTC m=+0.040038682 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:25:32 compute-0 podman[214781]: 2025-11-25 06:25:32.067324095 +0000 UTC m=+0.042640075 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Nov 25 06:25:32 compute-0 nova_compute[186241]: 2025-11-25 06:25:32.227 186245 DEBUG nova.network.neutron [req-42a95a43-5aee-449a-ab64-ea81faad6394 req-520fbc5b-5d3a-47d6-b3f8-3e2f1cbeb8f0 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updated VIF entry in instance network info cache for port 83e4beda-0cfb-4824-8d25-0345811c9a67. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:25:32 compute-0 nova_compute[186241]: 2025-11-25 06:25:32.227 186245 DEBUG nova.network.neutron [req-42a95a43-5aee-449a-ab64-ea81faad6394 req-520fbc5b-5d3a-47d6-b3f8-3e2f1cbeb8f0 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updating instance_info_cache with network_info: [{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:25:32 compute-0 nova_compute[186241]: 2025-11-25 06:25:32.730 186245 DEBUG oslo_concurrency.lockutils [req-42a95a43-5aee-449a-ab64-ea81faad6394 req-520fbc5b-5d3a-47d6-b3f8-3e2f1cbeb8f0 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:25:35 compute-0 podman[214820]: 2025-11-25 06:25:35.05501544 +0000 UTC m=+0.033720317 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 06:25:36 compute-0 nova_compute[186241]: 2025-11-25 06:25:36.415 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:36 compute-0 nova_compute[186241]: 2025-11-25 06:25:36.979 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:38 compute-0 nova_compute[186241]: 2025-11-25 06:25:38.480 186245 INFO nova.compute.manager [None req-f548d233-6b6c-45f5-936e-deae190ecd64 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Get console output
Nov 25 06:25:38 compute-0 nova_compute[186241]: 2025-11-25 06:25:38.483 211770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 25 06:25:40 compute-0 podman[214836]: 2025-11-25 06:25:40.061869038 +0000 UTC m=+0.041900825 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Nov 25 06:25:41 compute-0 nova_compute[186241]: 2025-11-25 06:25:41.417 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:41 compute-0 nova_compute[186241]: 2025-11-25 06:25:41.980 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:43.295 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:09:f3 10.100.0.17'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.17/28', 'neutron:device_id': 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7764c441-3630-43ef-a835-62532c499c69', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=376a7dc6-ccc2-4ff5-9992-66bd605dbeaf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d819e567-57aa-4c38-852c-35e41fc7980c) old=Port_Binding(mac=['fa:16:3e:62:09:f3'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7764c441-3630-43ef-a835-62532c499c69', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:25:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:43.296 103953 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d819e567-57aa-4c38-852c-35e41fc7980c in datapath 7764c441-3630-43ef-a835-62532c499c69 updated
Nov 25 06:25:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:43.297 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7764c441-3630-43ef-a835-62532c499c69, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:25:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:43.297 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[b76ad572-a1b8-423c-92a3-9c89e4fb56d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:25:44 compute-0 nova_compute[186241]: 2025-11-25 06:25:44.392 186245 DEBUG oslo_concurrency.lockutils [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "interface-90a703a7-09d1-4f58-84e5-80f4083b5922-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:44 compute-0 nova_compute[186241]: 2025-11-25 06:25:44.393 186245 DEBUG oslo_concurrency.lockutils [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "interface-90a703a7-09d1-4f58-84e5-80f4083b5922-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:44 compute-0 nova_compute[186241]: 2025-11-25 06:25:44.393 186245 DEBUG nova.objects.instance [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'flavor' on Instance uuid 90a703a7-09d1-4f58-84e5-80f4083b5922 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:25:45 compute-0 nova_compute[186241]: 2025-11-25 06:25:45.531 186245 DEBUG nova.objects.instance [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_requests' on Instance uuid 90a703a7-09d1-4f58-84e5-80f4083b5922 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:25:46 compute-0 nova_compute[186241]: 2025-11-25 06:25:46.034 186245 DEBUG nova.objects.base [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Object Instance<90a703a7-09d1-4f58-84e5-80f4083b5922> lazy-loaded attributes: flavor,pci_requests wrapper /usr/lib/python3.9/site-packages/nova/objects/base.py:136
Nov 25 06:25:46 compute-0 nova_compute[186241]: 2025-11-25 06:25:46.034 186245 DEBUG nova.network.neutron [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:25:46 compute-0 podman[214854]: 2025-11-25 06:25:46.060197547 +0000 UTC m=+0.040345749 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:25:46 compute-0 nova_compute[186241]: 2025-11-25 06:25:46.419 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:46 compute-0 nova_compute[186241]: 2025-11-25 06:25:46.982 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:47 compute-0 nova_compute[186241]: 2025-11-25 06:25:47.236 186245 DEBUG nova.policy [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:25:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:47.526 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:47.526 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:25:47.527 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:25:48 compute-0 nova_compute[186241]: 2025-11-25 06:25:48.422 186245 DEBUG nova.network.neutron [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Successfully created port: 6d3aa3ad-5f04-4c0f-bc86-9242dc134214 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:25:49 compute-0 nova_compute[186241]: 2025-11-25 06:25:49.590 186245 DEBUG nova.network.neutron [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Successfully updated port: 6d3aa3ad-5f04-4c0f-bc86-9242dc134214 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:25:49 compute-0 nova_compute[186241]: 2025-11-25 06:25:49.783 186245 DEBUG nova.compute.manager [req-a8e6dd1f-fcb1-4083-a5ee-e7501418c0b3 req-817e03a7-3770-43b8-805c-8cdbb1abfdd9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-changed-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:25:49 compute-0 nova_compute[186241]: 2025-11-25 06:25:49.784 186245 DEBUG nova.compute.manager [req-a8e6dd1f-fcb1-4083-a5ee-e7501418c0b3 req-817e03a7-3770-43b8-805c-8cdbb1abfdd9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Refreshing instance network info cache due to event network-changed-6d3aa3ad-5f04-4c0f-bc86-9242dc134214. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:25:49 compute-0 nova_compute[186241]: 2025-11-25 06:25:49.784 186245 DEBUG oslo_concurrency.lockutils [req-a8e6dd1f-fcb1-4083-a5ee-e7501418c0b3 req-817e03a7-3770-43b8-805c-8cdbb1abfdd9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:25:49 compute-0 nova_compute[186241]: 2025-11-25 06:25:49.784 186245 DEBUG oslo_concurrency.lockutils [req-a8e6dd1f-fcb1-4083-a5ee-e7501418c0b3 req-817e03a7-3770-43b8-805c-8cdbb1abfdd9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:25:49 compute-0 nova_compute[186241]: 2025-11-25 06:25:49.784 186245 DEBUG nova.network.neutron [req-a8e6dd1f-fcb1-4083-a5ee-e7501418c0b3 req-817e03a7-3770-43b8-805c-8cdbb1abfdd9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Refreshing network info cache for port 6d3aa3ad-5f04-4c0f-bc86-9242dc134214 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:25:50 compute-0 podman[214872]: 2025-11-25 06:25:50.058042009 +0000 UTC m=+0.038048198 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 25 06:25:50 compute-0 nova_compute[186241]: 2025-11-25 06:25:50.094 186245 DEBUG oslo_concurrency.lockutils [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:25:51 compute-0 nova_compute[186241]: 2025-11-25 06:25:51.420 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:51 compute-0 nova_compute[186241]: 2025-11-25 06:25:51.982 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:53 compute-0 nova_compute[186241]: 2025-11-25 06:25:53.767 186245 DEBUG nova.network.neutron [req-a8e6dd1f-fcb1-4083-a5ee-e7501418c0b3 req-817e03a7-3770-43b8-805c-8cdbb1abfdd9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Added VIF to instance network info cache for port 6d3aa3ad-5f04-4c0f-bc86-9242dc134214. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3546
Nov 25 06:25:53 compute-0 nova_compute[186241]: 2025-11-25 06:25:53.767 186245 DEBUG nova.network.neutron [req-a8e6dd1f-fcb1-4083-a5ee-e7501418c0b3 req-817e03a7-3770-43b8-805c-8cdbb1abfdd9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updating instance_info_cache with network_info: [{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:25:54 compute-0 nova_compute[186241]: 2025-11-25 06:25:54.272 186245 DEBUG oslo_concurrency.lockutils [req-a8e6dd1f-fcb1-4083-a5ee-e7501418c0b3 req-817e03a7-3770-43b8-805c-8cdbb1abfdd9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:25:54 compute-0 nova_compute[186241]: 2025-11-25 06:25:54.273 186245 DEBUG oslo_concurrency.lockutils [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:25:54 compute-0 nova_compute[186241]: 2025-11-25 06:25:54.273 186245 DEBUG nova.network.neutron [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:25:55 compute-0 nova_compute[186241]: 2025-11-25 06:25:55.223 186245 WARNING nova.network.neutron [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] 7764c441-3630-43ef-a835-62532c499c69 already exists in list: networks containing: ['7764c441-3630-43ef-a835-62532c499c69']. ignoring it
Nov 25 06:25:55 compute-0 nova_compute[186241]: 2025-11-25 06:25:55.223 186245 WARNING nova.network.neutron [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] 6d3aa3ad-5f04-4c0f-bc86-9242dc134214 already exists in list: port_ids containing: ['6d3aa3ad-5f04-4c0f-bc86-9242dc134214']. ignoring it
Nov 25 06:25:56 compute-0 nova_compute[186241]: 2025-11-25 06:25:56.422 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:56 compute-0 nova_compute[186241]: 2025-11-25 06:25:56.985 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:25:57 compute-0 nova_compute[186241]: 2025-11-25 06:25:57.231 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:25:57 compute-0 nova_compute[186241]: 2025-11-25 06:25:57.737 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Triggering sync for uuid 90a703a7-09d1-4f58-84e5-80f4083b5922 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10944
Nov 25 06:25:57 compute-0 nova_compute[186241]: 2025-11-25 06:25:57.737 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:25:57 compute-0 nova_compute[186241]: 2025-11-25 06:25:57.737 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:25:58 compute-0 podman[214893]: 2025-11-25 06:25:58.081944229 +0000 UTC m=+0.061661543 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Nov 25 06:25:58 compute-0 nova_compute[186241]: 2025-11-25 06:25:58.242 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.505s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:25:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:25:59.550 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:25:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:25:59.552 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/90a703a7-09d1-4f58-84e5-80f4083b5922 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}e471cc3fc7ae9ac5d8fd794e8aefa20e5f5c77c3e3edccb41964d2d46a7818d3" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.422 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 2146 Content-Type: application/json Date: Tue, 25 Nov 2025 06:25:59 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-4f26d707-f0fd-4fad-a566-bdc1777ec110 x-openstack-request-id: req-4f26d707-f0fd-4fad-a566-bdc1777ec110 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.423 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "90a703a7-09d1-4f58-84e5-80f4083b5922", "name": "tempest-TestNetworkBasicOps-server-2139323515", "status": "ACTIVE", "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "user_id": "66a05d0ca82146a5a458244c8e5364de", "metadata": {}, "hostId": "d6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5", "image": {"id": "5215c26e-be2f-40b4-ac47-476bfa3cf3f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/5215c26e-be2f-40b4-ac47-476bfa3cf3f2"}]}, "flavor": {"id": "53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac"}]}, "created": "2025-11-25T06:25:01Z", "updated": "2025-11-25T06:25:21Z", "addresses": {"tempest-network-smoke--82083730": [{"version": 4, "addr": "10.100.0.5", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:a2:a7:44"}, {"version": 4, "addr": "192.168.122.200", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:a2:a7:44"}], "tempest-network-smoke--723277504": [{"version": 4, "addr": "10.100.0.21", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:15:cf:0b"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/90a703a7-09d1-4f58-84e5-80f4083b5922"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/90a703a7-09d1-4f58-84e5-80f4083b5922"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-941751953", "OS-SRV-USG:launched_at": "2025-11-25T06:25:21.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "default"}, {"name": "tempest-secgroup-smoke-327898477"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000006", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.423 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/90a703a7-09d1-4f58-84e5-80f4083b5922 used request id req-4f26d707-f0fd-4fad-a566-bdc1777ec110 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.423 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '90a703a7-09d1-4f58-84e5-80f4083b5922', 'name': 'tempest-TestNetworkBasicOps-server-2139323515', 'flavor': {'id': '53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd90b557db9104ecfb816b1cdab8712bd', 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'hostId': 'd6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.424 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.424 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4700>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.424 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4700>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.424 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.425 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets (2025-11-25T06:26:00.424672) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.426 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 90a703a7-09d1-4f58-84e5-80f4083b5922 / tap83e4beda-0c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.426 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/network.outgoing.packets volume: 56 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.427 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.427 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.427 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.427 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2d00>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.427 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2d00>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.427 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.427 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2025-11-25T06:26:00.427533) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.444 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.write.bytes volume: 72962048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.444 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.444 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.444 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.445 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.445 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c42b0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.445 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c42b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.445 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.445 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.445 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.445 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.445 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.446 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4490>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.446 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4490>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.446 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.446 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.445 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2025-11-25T06:26:00.445243) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.446 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2025-11-25T06:26:00.446136) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.446 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.446 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.446 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.446 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b28e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.446 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b28e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.446 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.447 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2025-11-25T06:26:00.446962) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.453 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.453 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.454 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.454 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.454 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.454 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b23a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.454 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b23a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.454 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.454 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.read.requests volume: 1056 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.454 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2025-11-25T06:26:00.454640) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.455 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.read.requests volume: 113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.455 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.455 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.455 16 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.455 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800afdc0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.455 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800afdc0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.455 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: cpu heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.455 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2025-11-25T06:26:00.455691) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.465 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/cpu volume: 9570000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.466 16 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.466 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.466 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.466 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4af0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.466 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4af0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.466 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.466 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.466 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2025-11-25T06:26:00.466579) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.467 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.467 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.467 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.467 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4910>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.467 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4910>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.467 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.467 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2025-11-25T06:26:00.467440) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.467 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.467 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.467 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.468 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.468 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4100>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.468 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4100>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.468 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.468 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.468 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2025-11-25T06:26:00.468239) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.468 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.468 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.468 16 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.468 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800ca460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.468 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800ca460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.469 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: power.state heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.469 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.469 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2025-11-25T06:26:00.469057) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.469 16 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.469 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.469 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.469 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4760>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.469 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4760>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.469 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.469 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.470 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2025-11-25T06:26:00.469861) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.470 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2139323515>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2139323515>]
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.470 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.470 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.470 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4c70>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.470 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4c70>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.470 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.470 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/network.incoming.packets volume: 56 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.471 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.471 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.471 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.471 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2025-11-25T06:26:00.470631) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.471 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c44f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.471 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c44f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.471 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.471 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/network.incoming.bytes volume: 10422 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.471 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2025-11-25T06:26:00.471507) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.471 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.472 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.472 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.472 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4bb0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.472 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4bb0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.472 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.472 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.472 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2025-11-25T06:26:00.472306) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.472 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2139323515>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2139323515>]
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.472 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.472 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.472 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2310>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.472 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2310>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.473 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.473 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2025-11-25T06:26:00.472998) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.473 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.write.latency volume: 334847674 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.473 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.473 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.473 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.473 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.473 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b25e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.473 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b25e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.474 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.474 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2025-11-25T06:26:00.474026) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.474 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.read.latency volume: 196050464 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.474 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.read.latency volume: 20941723 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.474 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.474 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.474 16 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.474 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2b20>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.474 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2b20>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.475 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.root.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.475 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2025-11-25T06:26:00.475035) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.475 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.475 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.475 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.475 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4d30>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.475 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4d30>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.475 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.475 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.476 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2025-11-25T06:26:00.475739) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.476 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.476 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.476 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.476 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4070>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.476 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4070>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.476 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.476 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/network.outgoing.bytes volume: 8266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.476 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2025-11-25T06:26:00.476554) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.476 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.477 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.477 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.477 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2f40>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.477 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2f40>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.477 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.477 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.write.requests volume: 320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.477 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2025-11-25T06:26:00.477346) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.477 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.478 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.478 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.478 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.478 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2970>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.478 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2970>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.478 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.478 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2025-11-25T06:26:00.478409) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.478 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.478 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.479 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.479 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.479 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.479 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2520>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.479 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2520>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.479 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2025-11-25T06:26:00.479499) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.479 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.479 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.read.bytes volume: 29518336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.479 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.read.bytes volume: 284990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.480 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.480 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.480 16 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.480 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2a60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.480 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2a60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.480 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.480 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.481 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.481 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.481 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2025-11-25T06:26:00.480632) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.481 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.481 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.481 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.481 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.481 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2025-11-25T06:26:00.481372) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.481 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.482 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.482 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.482 16 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.482 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4310>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.482 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4310>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.482 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: memory.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.482 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2025-11-25T06:26:00.482401) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.482 16 DEBUG ceilometer.compute.pollsters [-] 90a703a7-09d1-4f58-84e5-80f4083b5922/memory.usage volume: 42.7109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:26:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:26:00.482 16 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.235 186245 DEBUG nova.network.neutron [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updating instance_info_cache with network_info: [{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.423 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.738 186245 DEBUG oslo_concurrency.lockutils [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.740 186245 DEBUG nova.virt.libvirt.vif [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:25:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2139323515',display_name='tempest-TestNetworkBasicOps-server-2139323515',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2139323515',id=6,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPfG/Hr03+7kUyqdyJ4VcrC6OgJZvQPY0869e/9DA7kenSXh4EDJbfr323zFsTAZ1JBig6V1BBInXPavwPrKol6GncaRLGsPY2WM3LUFf75N9E/ms8i8IlOrkZUHQpzmFA==',key_name='tempest-TestNetworkBasicOps-941751953',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:25:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-4i1j2g0l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:25:21Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=90a703a7-09d1-4f58-84e5-80f4083b5922,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.741 186245 DEBUG nova.network.os_vif_util [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.741 186245 DEBUG nova.network.os_vif_util [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:cf:0b,bridge_name='br-int',has_traffic_filtering=True,id=6d3aa3ad-5f04-4c0f-bc86-9242dc134214,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d3aa3ad-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.742 186245 DEBUG os_vif [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:cf:0b,bridge_name='br-int',has_traffic_filtering=True,id=6d3aa3ad-5f04-4c0f-bc86-9242dc134214,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d3aa3ad-5f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.742 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.742 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.743 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.743 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.743 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1ae79ecb-9aa8-50b1-bd10-f66f0c89952f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.744 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.746 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.748 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.748 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d3aa3ad-5f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.749 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap6d3aa3ad-5f, col_values=(('qos', UUID('3158fd8a-ea6e-482d-9df4-9c475095009c')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.749 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap6d3aa3ad-5f, col_values=(('external_ids', {'iface-id': '6d3aa3ad-5f04-4c0f-bc86-9242dc134214', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:cf:0b', 'vm-uuid': '90a703a7-09d1-4f58-84e5-80f4083b5922'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:01 compute-0 NetworkManager[55345]: <info>  [1764051961.7506] manager: (tap6d3aa3ad-5f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.750 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.753 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.754 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.755 186245 INFO os_vif [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:cf:0b,bridge_name='br-int',has_traffic_filtering=True,id=6d3aa3ad-5f04-4c0f-bc86-9242dc134214,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d3aa3ad-5f')
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.755 186245 DEBUG nova.virt.libvirt.vif [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:25:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2139323515',display_name='tempest-TestNetworkBasicOps-server-2139323515',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2139323515',id=6,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPfG/Hr03+7kUyqdyJ4VcrC6OgJZvQPY0869e/9DA7kenSXh4EDJbfr323zFsTAZ1JBig6V1BBInXPavwPrKol6GncaRLGsPY2WM3LUFf75N9E/ms8i8IlOrkZUHQpzmFA==',key_name='tempest-TestNetworkBasicOps-941751953',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:25:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-4i1j2g0l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:25:21Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=90a703a7-09d1-4f58-84e5-80f4083b5922,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.756 186245 DEBUG nova.network.os_vif_util [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.756 186245 DEBUG nova.network.os_vif_util [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:cf:0b,bridge_name='br-int',has_traffic_filtering=True,id=6d3aa3ad-5f04-4c0f-bc86-9242dc134214,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d3aa3ad-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.757 186245 DEBUG nova.virt.libvirt.guest [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] attach device xml: <interface type="ethernet">
Nov 25 06:26:01 compute-0 nova_compute[186241]:   <mac address="fa:16:3e:15:cf:0b"/>
Nov 25 06:26:01 compute-0 nova_compute[186241]:   <model type="virtio"/>
Nov 25 06:26:01 compute-0 nova_compute[186241]:   <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:26:01 compute-0 nova_compute[186241]:   <mtu size="1442"/>
Nov 25 06:26:01 compute-0 nova_compute[186241]:   <target dev="tap6d3aa3ad-5f"/>
Nov 25 06:26:01 compute-0 nova_compute[186241]: </interface>
Nov 25 06:26:01 compute-0 nova_compute[186241]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:336
Nov 25 06:26:01 compute-0 kernel: tap6d3aa3ad-5f: entered promiscuous mode
Nov 25 06:26:01 compute-0 NetworkManager[55345]: <info>  [1764051961.7652] manager: (tap6d3aa3ad-5f): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Nov 25 06:26:01 compute-0 ovn_controller[95135]: 2025-11-25T06:26:01Z|00103|binding|INFO|Claiming lport 6d3aa3ad-5f04-4c0f-bc86-9242dc134214 for this chassis.
Nov 25 06:26:01 compute-0 ovn_controller[95135]: 2025-11-25T06:26:01Z|00104|binding|INFO|6d3aa3ad-5f04-4c0f-bc86-9242dc134214: Claiming fa:16:3e:15:cf:0b 10.100.0.21
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.769 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.773 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:cf:0b 10.100.0.21'], port_security=['fa:16:3e:15:cf:0b 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': '90a703a7-09d1-4f58-84e5-80f4083b5922', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7764c441-3630-43ef-a835-62532c499c69', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dafbde12-3514-4e2d-980f-9529576187d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=376a7dc6-ccc2-4ff5-9992-66bd605dbeaf, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=6d3aa3ad-5f04-4c0f-bc86-9242dc134214) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.774 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 6d3aa3ad-5f04-4c0f-bc86-9242dc134214 in datapath 7764c441-3630-43ef-a835-62532c499c69 bound to our chassis
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.775 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7764c441-3630-43ef-a835-62532c499c69
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.783 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[3461080b-3227-4e32-a8c2-c7b9f41116f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.783 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7764c441-31 in ovnmeta-7764c441-3630-43ef-a835-62532c499c69 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.785 211354 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7764c441-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.785 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[f064b676-6287-4ae1-8db6-26e46058d774]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.786 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[061fe333-3053-4c07-8181-6956cd92152e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.794 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[94563dd2-53a6-47e4-a1cd-4e610d7ebb4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 systemd-udevd[214924]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:26:01 compute-0 NetworkManager[55345]: <info>  [1764051961.8069] device (tap6d3aa3ad-5f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:26:01 compute-0 NetworkManager[55345]: <info>  [1764051961.8078] device (tap6d3aa3ad-5f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.815 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:01 compute-0 ovn_controller[95135]: 2025-11-25T06:26:01Z|00105|binding|INFO|Setting lport 6d3aa3ad-5f04-4c0f-bc86-9242dc134214 ovn-installed in OVS
Nov 25 06:26:01 compute-0 ovn_controller[95135]: 2025-11-25T06:26:01Z|00106|binding|INFO|Setting lport 6d3aa3ad-5f04-4c0f-bc86-9242dc134214 up in Southbound
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.817 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.819 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[13ab408a-ca24-41c8-b893-db22cdf13fec]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.848 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[91003c8c-daae-4b3e-bfc5-58888997f1e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 NetworkManager[55345]: <info>  [1764051961.8531] manager: (tap7764c441-30): new Veth device (/org/freedesktop/NetworkManager/Devices/61)
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.853 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[a6021959-58c6-4d30-8080-bc94bb95ec90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.876 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[f29a1a01-264a-43d7-b15d-e73caf6eaf54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.877 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[b04083d8-6ab1-4698-b13d-7de7de3346bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 NetworkManager[55345]: <info>  [1764051961.8934] device (tap7764c441-30): carrier: link connected
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.896 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b06b40-bf6d-47e0-9d55-d5569e6579c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.909 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[0baf599b-ea0f-44a2-9589-9bad3196fb12]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7764c441-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:09:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 295029, 'reachable_time': 23204, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214941, 'error': None, 'target': 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.919 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[c32d8b84-692e-4f08-8c50-820c9f9fe290]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe62:9f3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 295029, 'tstamp': 295029}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214942, 'error': None, 'target': 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.930 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc336cb-eaa1-4f7b-a3aa-4a67878b9820]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7764c441-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:09:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 295029, 'reachable_time': 23204, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214943, 'error': None, 'target': 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.949 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[49bed224-c095-4790-8967-24af81203a12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.985 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.990 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[149cd58f-cc40-45c7-8be9-0c5f14f5c64a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.991 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7764c441-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.991 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.991 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7764c441-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:01 compute-0 NetworkManager[55345]: <info>  [1764051961.9936] manager: (tap7764c441-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Nov 25 06:26:01 compute-0 kernel: tap7764c441-30: entered promiscuous mode
Nov 25 06:26:01 compute-0 nova_compute[186241]: 2025-11-25 06:26:01.994 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:01.995 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7764c441-30, col_values=(('external_ids', {'iface-id': 'd819e567-57aa-4c38-852c-35e41fc7980c'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:01 compute-0 ovn_controller[95135]: 2025-11-25T06:26:01Z|00107|binding|INFO|Releasing lport d819e567-57aa-4c38-852c-35e41fc7980c from this chassis (sb_readonly=0)
Nov 25 06:26:02 compute-0 nova_compute[186241]: 2025-11-25 06:26:02.008 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:02.009 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[a3875c7b-ca9c-4bdf-a5e9-b83c1d4fcc06]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:02.009 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:02.010 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:02.010 103953 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 7764c441-3630-43ef-a835-62532c499c69 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:02.010 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:02.010 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[27f2f430-3c37-435e-aaa4-6ab40c5899f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:02.011 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:02.011 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[457bb1f2-0f99-473b-a896-e021e93c2b81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:02.011 103953 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: global
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     log         /dev/log local0 debug
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     log-tag     haproxy-metadata-proxy-7764c441-3630-43ef-a835-62532c499c69
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     user        root
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     group       root
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     maxconn     1024
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     pidfile     /var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     daemon
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: defaults
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     log global
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     mode http
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     option httplog
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     option dontlognull
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     option http-server-close
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     option forwardfor
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     retries                 3
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     timeout http-request    30s
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     timeout connect         30s
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     timeout client          32s
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     timeout server          32s
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     timeout http-keep-alive 30s
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: listen listener
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     bind 169.254.169.254:80
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:     http-request add-header X-OVN-Network-ID 7764c441-3630-43ef-a835-62532c499c69
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:02.012 103953 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'env', 'PROCESS_TAG=haproxy-7764c441-3630-43ef-a835-62532c499c69', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7764c441-3630-43ef-a835-62532c499c69.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Nov 25 06:26:02 compute-0 podman[214971]: 2025-11-25 06:26:02.299889918 +0000 UTC m=+0.032754190 container create e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 25 06:26:02 compute-0 systemd[1]: Started libpod-conmon-e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7.scope.
Nov 25 06:26:02 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:26:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/609753f362590549c07227ded02779f17ed025514ec7ac42fc942ab8b7069bd7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:26:02 compute-0 podman[214971]: 2025-11-25 06:26:02.353577202 +0000 UTC m=+0.086441493 container init e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:26:02 compute-0 podman[214971]: 2025-11-25 06:26:02.358272613 +0000 UTC m=+0.091136885 container start e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 25 06:26:02 compute-0 podman[214971]: 2025-11-25 06:26:02.284832457 +0000 UTC m=+0.017696748 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:26:02 compute-0 neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69[214995]: [NOTICE]   (215016) : New worker (215028) forked
Nov 25 06:26:02 compute-0 neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69[214995]: [NOTICE]   (215016) : Loading success.
Nov 25 06:26:02 compute-0 podman[214981]: 2025-11-25 06:26:02.396733907 +0000 UTC m=+0.070065751 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 06:26:02 compute-0 podman[214984]: 2025-11-25 06:26:02.398238747 +0000 UTC m=+0.072468861 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 06:26:02 compute-0 nova_compute[186241]: 2025-11-25 06:26:02.756 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:02.756 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:26:02 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:02.758 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:26:02 compute-0 nova_compute[186241]: 2025-11-25 06:26:02.854 186245 DEBUG nova.compute.manager [req-a9747edb-bb75-4450-9c6f-4d3731d6be40 req-4bede13c-97a9-4eb1-93f1-8802b215cc3b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-vif-plugged-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:26:02 compute-0 nova_compute[186241]: 2025-11-25 06:26:02.854 186245 DEBUG oslo_concurrency.lockutils [req-a9747edb-bb75-4450-9c6f-4d3731d6be40 req-4bede13c-97a9-4eb1-93f1-8802b215cc3b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:02 compute-0 nova_compute[186241]: 2025-11-25 06:26:02.854 186245 DEBUG oslo_concurrency.lockutils [req-a9747edb-bb75-4450-9c6f-4d3731d6be40 req-4bede13c-97a9-4eb1-93f1-8802b215cc3b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:02 compute-0 nova_compute[186241]: 2025-11-25 06:26:02.855 186245 DEBUG oslo_concurrency.lockutils [req-a9747edb-bb75-4450-9c6f-4d3731d6be40 req-4bede13c-97a9-4eb1-93f1-8802b215cc3b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:02 compute-0 nova_compute[186241]: 2025-11-25 06:26:02.855 186245 DEBUG nova.compute.manager [req-a9747edb-bb75-4450-9c6f-4d3731d6be40 req-4bede13c-97a9-4eb1-93f1-8802b215cc3b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] No waiting events found dispatching network-vif-plugged-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:26:02 compute-0 nova_compute[186241]: 2025-11-25 06:26:02.855 186245 WARNING nova.compute.manager [req-a9747edb-bb75-4450-9c6f-4d3731d6be40 req-4bede13c-97a9-4eb1-93f1-8802b215cc3b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received unexpected event network-vif-plugged-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 for instance with vm_state active and task_state None.
Nov 25 06:26:02 compute-0 ovn_controller[95135]: 2025-11-25T06:26:02Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:cf:0b 10.100.0.21
Nov 25 06:26:02 compute-0 ovn_controller[95135]: 2025-11-25T06:26:02Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:cf:0b 10.100.0.21
Nov 25 06:26:03 compute-0 nova_compute[186241]: 2025-11-25 06:26:03.332 186245 DEBUG nova.virt.libvirt.driver [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:26:03 compute-0 nova_compute[186241]: 2025-11-25 06:26:03.332 186245 DEBUG nova.virt.libvirt.driver [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:26:03 compute-0 nova_compute[186241]: 2025-11-25 06:26:03.332 186245 DEBUG nova.virt.libvirt.driver [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:a2:a7:44, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:26:03 compute-0 nova_compute[186241]: 2025-11-25 06:26:03.333 186245 DEBUG nova.virt.libvirt.driver [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:15:cf:0b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:26:03 compute-0 nova_compute[186241]: 2025-11-25 06:26:03.837 186245 DEBUG nova.virt.driver [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-2139323515', uuid='90a703a7-09d1-4f58-84e5-80f4083b5922'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus='sata',hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus='virtio',hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus='usb',hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type='q35',hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model='usbtablet',hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model='virtio',hw_video_ram=<?>,hw_vif_model='virtio',hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bit
torrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", 
"bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764051963.8376725) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:26:03 compute-0 nova_compute[186241]: 2025-11-25 06:26:03.838 186245 DEBUG nova.virt.libvirt.guest [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:26:03 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:26:03 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-2139323515</nova:name>
Nov 25 06:26:03 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:26:03</nova:creationTime>
Nov 25 06:26:03 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:26:03 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:26:03 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:26:03 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:26:03 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:26:03 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:26:03 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:26:03 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:26:03 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:26:03 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:26:03 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:26:03 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:26:03 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:26:03 compute-0 nova_compute[186241]:     <nova:port uuid="83e4beda-0cfb-4824-8d25-0345811c9a67">
Nov 25 06:26:03 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 06:26:03 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:26:03 compute-0 nova_compute[186241]:     <nova:port uuid="6d3aa3ad-5f04-4c0f-bc86-9242dc134214">
Nov 25 06:26:03 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Nov 25 06:26:03 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:26:03 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:26:03 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:26:03 compute-0 nova_compute[186241]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:356
Nov 25 06:26:04 compute-0 nova_compute[186241]: 2025-11-25 06:26:04.344 186245 DEBUG oslo_concurrency.lockutils [None req-9d9c5978-efc1-4c68-b03a-7e8f6e137a30 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "interface-90a703a7-09d1-4f58-84e5-80f4083b5922-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 19.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:05 compute-0 nova_compute[186241]: 2025-11-25 06:26:05.016 186245 DEBUG nova.compute.manager [req-ed924463-542a-406f-9fe5-5ab77b3269ab req-caa3448e-8e96-4477-9730-22588fc75d42 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-vif-plugged-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:26:05 compute-0 nova_compute[186241]: 2025-11-25 06:26:05.017 186245 DEBUG oslo_concurrency.lockutils [req-ed924463-542a-406f-9fe5-5ab77b3269ab req-caa3448e-8e96-4477-9730-22588fc75d42 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:05 compute-0 nova_compute[186241]: 2025-11-25 06:26:05.017 186245 DEBUG oslo_concurrency.lockutils [req-ed924463-542a-406f-9fe5-5ab77b3269ab req-caa3448e-8e96-4477-9730-22588fc75d42 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:05 compute-0 nova_compute[186241]: 2025-11-25 06:26:05.017 186245 DEBUG oslo_concurrency.lockutils [req-ed924463-542a-406f-9fe5-5ab77b3269ab req-caa3448e-8e96-4477-9730-22588fc75d42 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:05 compute-0 nova_compute[186241]: 2025-11-25 06:26:05.017 186245 DEBUG nova.compute.manager [req-ed924463-542a-406f-9fe5-5ab77b3269ab req-caa3448e-8e96-4477-9730-22588fc75d42 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] No waiting events found dispatching network-vif-plugged-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:26:05 compute-0 nova_compute[186241]: 2025-11-25 06:26:05.017 186245 WARNING nova.compute.manager [req-ed924463-542a-406f-9fe5-5ab77b3269ab req-caa3448e-8e96-4477-9730-22588fc75d42 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received unexpected event network-vif-plugged-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 for instance with vm_state active and task_state None.
Nov 25 06:26:06 compute-0 podman[215035]: 2025-11-25 06:26:06.063016013 +0000 UTC m=+0.039277673 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 06:26:06 compute-0 nova_compute[186241]: 2025-11-25 06:26:06.750 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:06 compute-0 nova_compute[186241]: 2025-11-25 06:26:06.988 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:11 compute-0 podman[215051]: 2025-11-25 06:26:11.064012278 +0000 UTC m=+0.042614829 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, architecture=x86_64)
Nov 25 06:26:11 compute-0 nova_compute[186241]: 2025-11-25 06:26:11.754 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:11 compute-0 nova_compute[186241]: 2025-11-25 06:26:11.989 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:12 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:12.759 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:16 compute-0 nova_compute[186241]: 2025-11-25 06:26:16.758 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:16 compute-0 nova_compute[186241]: 2025-11-25 06:26:16.990 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:17 compute-0 podman[215069]: 2025-11-25 06:26:17.065108227 +0000 UTC m=+0.043244757 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 25 06:26:18 compute-0 nova_compute[186241]: 2025-11-25 06:26:18.397 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "a5429f72-73a0-4ab5-be90-931d49a7de1d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:18 compute-0 nova_compute[186241]: 2025-11-25 06:26:18.397 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:18 compute-0 nova_compute[186241]: 2025-11-25 06:26:18.900 186245 DEBUG nova.compute.manager [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:26:19 compute-0 nova_compute[186241]: 2025-11-25 06:26:19.589 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:19 compute-0 nova_compute[186241]: 2025-11-25 06:26:19.590 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:19 compute-0 nova_compute[186241]: 2025-11-25 06:26:19.596 186245 DEBUG nova.virt.hardware [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:26:19 compute-0 nova_compute[186241]: 2025-11-25 06:26:19.597 186245 INFO nova.compute.claims [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:26:20 compute-0 nova_compute[186241]: 2025-11-25 06:26:20.950 186245 DEBUG nova.compute.provider_tree [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:26:21 compute-0 podman[215087]: 2025-11-25 06:26:21.060952993 +0000 UTC m=+0.038154419 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 25 06:26:21 compute-0 nova_compute[186241]: 2025-11-25 06:26:21.438 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:26:21 compute-0 nova_compute[186241]: 2025-11-25 06:26:21.438 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:26:21 compute-0 nova_compute[186241]: 2025-11-25 06:26:21.454 186245 DEBUG nova.scheduler.client.report [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:26:21 compute-0 nova_compute[186241]: 2025-11-25 06:26:21.762 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:21 compute-0 nova_compute[186241]: 2025-11-25 06:26:21.959 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:21 compute-0 nova_compute[186241]: 2025-11-25 06:26:21.960 186245 DEBUG nova.compute.manager [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:26:21 compute-0 nova_compute[186241]: 2025-11-25 06:26:21.991 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:22 compute-0 nova_compute[186241]: 2025-11-25 06:26:22.465 186245 DEBUG nova.compute.manager [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:26:22 compute-0 nova_compute[186241]: 2025-11-25 06:26:22.465 186245 DEBUG nova.network.neutron [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:26:22 compute-0 nova_compute[186241]: 2025-11-25 06:26:22.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:26:22 compute-0 nova_compute[186241]: 2025-11-25 06:26:22.970 186245 INFO nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:26:23 compute-0 nova_compute[186241]: 2025-11-25 06:26:23.236 186245 DEBUG nova.policy [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:26:23 compute-0 nova_compute[186241]: 2025-11-25 06:26:23.474 186245 DEBUG nova.compute.manager [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:26:23 compute-0 nova_compute[186241]: 2025-11-25 06:26:23.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:26:23 compute-0 nova_compute[186241]: 2025-11-25 06:26:23.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:26:23 compute-0 nova_compute[186241]: 2025-11-25 06:26:23.932 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:26:23 compute-0 nova_compute[186241]: 2025-11-25 06:26:23.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.441 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.441 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.441 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.441 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.483 186245 DEBUG nova.compute.manager [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.484 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.484 186245 INFO nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Creating image(s)
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.485 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.485 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.486 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.486 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.489 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.490 186245 DEBUG oslo_concurrency.processutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.534 186245 DEBUG oslo_concurrency.processutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.535 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.535 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.535 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.538 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.539 186245 DEBUG oslo_concurrency.processutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.551 186245 DEBUG nova.network.neutron [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Successfully created port: 64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.582 186245 DEBUG oslo_concurrency.processutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.582 186245 DEBUG oslo_concurrency.processutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.601 186245 DEBUG oslo_concurrency.processutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.602 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.602 186245 DEBUG oslo_concurrency.processutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.645 186245 DEBUG oslo_concurrency.processutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.646 186245 DEBUG nova.virt.disk.api [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.646 186245 DEBUG oslo_concurrency.processutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.689 186245 DEBUG oslo_concurrency.processutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.690 186245 DEBUG nova.virt.disk.api [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.690 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.691 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Ensure instance console log exists: /var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.691 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.691 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:24 compute-0 nova_compute[186241]: 2025-11-25 06:26:24.691 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.450 186245 DEBUG nova.network.neutron [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Successfully updated port: 64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.464 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.511 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.511 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.557 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.648 186245 DEBUG nova.compute.manager [req-29a5154e-417d-494b-8285-3fd50deb8a18 req-007ecddf-670c-4fcb-9d12-cc05a0615757 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Received event network-changed-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.648 186245 DEBUG nova.compute.manager [req-29a5154e-417d-494b-8285-3fd50deb8a18 req-007ecddf-670c-4fcb-9d12-cc05a0615757 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Refreshing instance network info cache due to event network-changed-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.648 186245 DEBUG oslo_concurrency.lockutils [req-29a5154e-417d-494b-8285-3fd50deb8a18 req-007ecddf-670c-4fcb-9d12-cc05a0615757 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-a5429f72-73a0-4ab5-be90-931d49a7de1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.649 186245 DEBUG oslo_concurrency.lockutils [req-29a5154e-417d-494b-8285-3fd50deb8a18 req-007ecddf-670c-4fcb-9d12-cc05a0615757 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-a5429f72-73a0-4ab5-be90-931d49a7de1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.649 186245 DEBUG nova.network.neutron [req-29a5154e-417d-494b-8285-3fd50deb8a18 req-007ecddf-670c-4fcb-9d12-cc05a0615757 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Refreshing network info cache for port 64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.808 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.809 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5592MB free_disk=72.99283218383789GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.809 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.809 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:25 compute-0 nova_compute[186241]: 2025-11-25 06:26:25.953 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-a5429f72-73a0-4ab5-be90-931d49a7de1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:26:26 compute-0 nova_compute[186241]: 2025-11-25 06:26:26.764 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:26 compute-0 nova_compute[186241]: 2025-11-25 06:26:26.851 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance 90a703a7-09d1-4f58-84e5-80f4083b5922 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:26:26 compute-0 nova_compute[186241]: 2025-11-25 06:26:26.852 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance a5429f72-73a0-4ab5-be90-931d49a7de1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:26:26 compute-0 nova_compute[186241]: 2025-11-25 06:26:26.852 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:26:26 compute-0 nova_compute[186241]: 2025-11-25 06:26:26.852 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:26:26 compute-0 nova_compute[186241]: 2025-11-25 06:26:26.910 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:26:26 compute-0 nova_compute[186241]: 2025-11-25 06:26:26.992 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:27 compute-0 nova_compute[186241]: 2025-11-25 06:26:27.245 186245 DEBUG nova.network.neutron [req-29a5154e-417d-494b-8285-3fd50deb8a18 req-007ecddf-670c-4fcb-9d12-cc05a0615757 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:26:27 compute-0 nova_compute[186241]: 2025-11-25 06:26:27.415 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:26:27 compute-0 nova_compute[186241]: 2025-11-25 06:26:27.921 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:26:27 compute-0 nova_compute[186241]: 2025-11-25 06:26:27.921 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:28 compute-0 nova_compute[186241]: 2025-11-25 06:26:28.415 186245 DEBUG nova.network.neutron [req-29a5154e-417d-494b-8285-3fd50deb8a18 req-007ecddf-670c-4fcb-9d12-cc05a0615757 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:26:28 compute-0 nova_compute[186241]: 2025-11-25 06:26:28.918 186245 DEBUG oslo_concurrency.lockutils [req-29a5154e-417d-494b-8285-3fd50deb8a18 req-007ecddf-670c-4fcb-9d12-cc05a0615757 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-a5429f72-73a0-4ab5-be90-931d49a7de1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:26:28 compute-0 nova_compute[186241]: 2025-11-25 06:26:28.919 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-a5429f72-73a0-4ab5-be90-931d49a7de1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:26:28 compute-0 nova_compute[186241]: 2025-11-25 06:26:28.919 186245 DEBUG nova.network.neutron [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:26:29 compute-0 podman[215130]: 2025-11-25 06:26:29.086932854 +0000 UTC m=+0.062115852 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 25 06:26:29 compute-0 nova_compute[186241]: 2025-11-25 06:26:29.917 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:26:29 compute-0 nova_compute[186241]: 2025-11-25 06:26:29.918 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:26:30 compute-0 nova_compute[186241]: 2025-11-25 06:26:30.116 186245 DEBUG nova.network.neutron [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:26:31 compute-0 nova_compute[186241]: 2025-11-25 06:26:31.767 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:31 compute-0 nova_compute[186241]: 2025-11-25 06:26:31.778 186245 DEBUG nova.network.neutron [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Updating instance_info_cache with network_info: [{"id": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "address": "fa:16:3e:a1:91:29", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64197186-bb", "ovs_interfaceid": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:26:31 compute-0 nova_compute[186241]: 2025-11-25 06:26:31.993 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.281 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-a5429f72-73a0-4ab5-be90-931d49a7de1d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.282 186245 DEBUG nova.compute.manager [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Instance network_info: |[{"id": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "address": "fa:16:3e:a1:91:29", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64197186-bb", "ovs_interfaceid": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.283 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Start _get_guest_xml network_info=[{"id": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "address": "fa:16:3e:a1:91:29", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64197186-bb", "ovs_interfaceid": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.286 186245 WARNING nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.286 186245 DEBUG nova.virt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1879170506', uuid='a5429f72-73a0-4ab5-be90-931d49a7de1d'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_m
apping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "address": "fa:16:3e:a1:91:29", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64197186-bb", "ovs_interfaceid": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764051992.286913) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.290 186245 DEBUG nova.virt.libvirt.host [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.290 186245 DEBUG nova.virt.libvirt.host [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.293 186245 DEBUG nova.virt.libvirt.host [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.294 186245 DEBUG nova.virt.libvirt.host [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.294 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.294 186245 DEBUG nova.virt.hardware [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.294 186245 DEBUG nova.virt.hardware [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.295 186245 DEBUG nova.virt.hardware [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.295 186245 DEBUG nova.virt.hardware [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.295 186245 DEBUG nova.virt.hardware [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.295 186245 DEBUG nova.virt.hardware [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.295 186245 DEBUG nova.virt.hardware [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.296 186245 DEBUG nova.virt.hardware [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.296 186245 DEBUG nova.virt.hardware [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.296 186245 DEBUG nova.virt.hardware [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.296 186245 DEBUG nova.virt.hardware [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.298 186245 DEBUG nova.virt.libvirt.vif [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:26:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1879170506',display_name='tempest-TestNetworkBasicOps-server-1879170506',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1879170506',id=7,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGLAKi7VLxRhsPE2r8kNdXF3PlDAAJCMi5ZM64DyUrr5++49H1lnnmy5/O3GRMuWwGEhUb0U4RubgbH+Ry2yNJJKLvppbBhaqpbtQ7/LqbkgdQFuUf6/J5amB98nrsHUfw==',key_name='tempest-TestNetworkBasicOps-1608672233',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-mt2mmusq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:26:23Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=a5429f72-73a0-4ab5-be90-931d49a7de1d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "address": "fa:16:3e:a1:91:29", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64197186-bb", "ovs_interfaceid": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.299 186245 DEBUG nova.network.os_vif_util [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "address": "fa:16:3e:a1:91:29", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64197186-bb", "ovs_interfaceid": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.299 186245 DEBUG nova.network.os_vif_util [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:91:29,bridge_name='br-int',has_traffic_filtering=True,id=64197186-bbfb-4f6f-97c7-c2e7d81c0ac2,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64197186-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.300 186245 DEBUG nova.objects.instance [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid a5429f72-73a0-4ab5-be90-931d49a7de1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.804 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:26:32 compute-0 nova_compute[186241]:   <uuid>a5429f72-73a0-4ab5-be90-931d49a7de1d</uuid>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   <name>instance-00000007</name>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-1879170506</nova:name>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:26:32</nova:creationTime>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:26:32 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:26:32 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:26:32 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:26:32 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:26:32 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:26:32 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:26:32 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:26:32 compute-0 nova_compute[186241]:         <nova:port uuid="64197186-bbfb-4f6f-97c7-c2e7d81c0ac2">
Nov 25 06:26:32 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <system>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <entry name="serial">a5429f72-73a0-4ab5-be90-931d49a7de1d</entry>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <entry name="uuid">a5429f72-73a0-4ab5-be90-931d49a7de1d</entry>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     </system>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   <os>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   </os>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   <features>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   </features>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk.config"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:a1:91:29"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <target dev="tap64197186-bb"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/console.log" append="off"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <video>
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     </video>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:26:32 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:26:32 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:26:32 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:26:32 compute-0 nova_compute[186241]: </domain>
Nov 25 06:26:32 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.805 186245 DEBUG nova.compute.manager [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Preparing to wait for external event network-vif-plugged-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.806 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.806 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.806 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.806 186245 DEBUG nova.virt.libvirt.vif [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:26:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1879170506',display_name='tempest-TestNetworkBasicOps-server-1879170506',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1879170506',id=7,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGLAKi7VLxRhsPE2r8kNdXF3PlDAAJCMi5ZM64DyUrr5++49H1lnnmy5/O3GRMuWwGEhUb0U4RubgbH+Ry2yNJJKLvppbBhaqpbtQ7/LqbkgdQFuUf6/J5amB98nrsHUfw==',key_name='tempest-TestNetworkBasicOps-1608672233',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-mt2mmusq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:26:23Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=a5429f72-73a0-4ab5-be90-931d49a7de1d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "address": "fa:16:3e:a1:91:29", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64197186-bb", "ovs_interfaceid": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.807 186245 DEBUG nova.network.os_vif_util [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "address": "fa:16:3e:a1:91:29", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64197186-bb", "ovs_interfaceid": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.807 186245 DEBUG nova.network.os_vif_util [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:91:29,bridge_name='br-int',has_traffic_filtering=True,id=64197186-bbfb-4f6f-97c7-c2e7d81c0ac2,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64197186-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.807 186245 DEBUG os_vif [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:91:29,bridge_name='br-int',has_traffic_filtering=True,id=64197186-bbfb-4f6f-97c7-c2e7d81c0ac2,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64197186-bb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.808 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.808 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.808 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.809 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.809 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a17cf986-e13f-5990-b92e-42de6093378e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.810 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.813 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.815 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.815 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap64197186-bb, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.815 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap64197186-bb, col_values=(('qos', UUID('6677d0b7-2200-4515-85ec-8461fe928787')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.815 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap64197186-bb, col_values=(('external_ids', {'iface-id': '64197186-bbfb-4f6f-97c7-c2e7d81c0ac2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a1:91:29', 'vm-uuid': 'a5429f72-73a0-4ab5-be90-931d49a7de1d'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.816 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:32 compute-0 NetworkManager[55345]: <info>  [1764051992.8173] manager: (tap64197186-bb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.818 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.820 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:32 compute-0 nova_compute[186241]: 2025-11-25 06:26:32.821 186245 INFO os_vif [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:91:29,bridge_name='br-int',has_traffic_filtering=True,id=64197186-bbfb-4f6f-97c7-c2e7d81c0ac2,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64197186-bb')
Nov 25 06:26:33 compute-0 podman[215155]: 2025-11-25 06:26:33.068004003 +0000 UTC m=+0.043566050 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 06:26:33 compute-0 podman[215156]: 2025-11-25 06:26:33.091023542 +0000 UTC m=+0.064698135 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:26:34 compute-0 nova_compute[186241]: 2025-11-25 06:26:34.341 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:26:34 compute-0 nova_compute[186241]: 2025-11-25 06:26:34.341 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:26:34 compute-0 nova_compute[186241]: 2025-11-25 06:26:34.341 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:a1:91:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:26:34 compute-0 nova_compute[186241]: 2025-11-25 06:26:34.342 186245 INFO nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Using config drive
Nov 25 06:26:35 compute-0 nova_compute[186241]: 2025-11-25 06:26:35.498 186245 INFO nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Creating config drive at /var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk.config
Nov 25 06:26:35 compute-0 nova_compute[186241]: 2025-11-25 06:26:35.503 186245 DEBUG oslo_concurrency.processutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmp7i1dptpu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:26:35 compute-0 nova_compute[186241]: 2025-11-25 06:26:35.619 186245 DEBUG oslo_concurrency.processutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmp7i1dptpu" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:26:35 compute-0 NetworkManager[55345]: <info>  [1764051995.6604] manager: (tap64197186-bb): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Nov 25 06:26:35 compute-0 kernel: tap64197186-bb: entered promiscuous mode
Nov 25 06:26:35 compute-0 ovn_controller[95135]: 2025-11-25T06:26:35Z|00108|binding|INFO|Claiming lport 64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 for this chassis.
Nov 25 06:26:35 compute-0 ovn_controller[95135]: 2025-11-25T06:26:35Z|00109|binding|INFO|64197186-bbfb-4f6f-97c7-c2e7d81c0ac2: Claiming fa:16:3e:a1:91:29 10.100.0.20
Nov 25 06:26:35 compute-0 nova_compute[186241]: 2025-11-25 06:26:35.663 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.668 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:91:29 10.100.0.20'], port_security=['fa:16:3e:a1:91:29 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'a5429f72-73a0-4ab5-be90-931d49a7de1d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7764c441-3630-43ef-a835-62532c499c69', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '959f95cc-cb47-461e-b7e7-59cb871f6e80', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=376a7dc6-ccc2-4ff5-9992-66bd605dbeaf, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=64197186-bbfb-4f6f-97c7-c2e7d81c0ac2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.669 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 in datapath 7764c441-3630-43ef-a835-62532c499c69 bound to our chassis
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.670 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7764c441-3630-43ef-a835-62532c499c69
Nov 25 06:26:35 compute-0 ovn_controller[95135]: 2025-11-25T06:26:35Z|00110|binding|INFO|Setting lport 64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 ovn-installed in OVS
Nov 25 06:26:35 compute-0 ovn_controller[95135]: 2025-11-25T06:26:35Z|00111|binding|INFO|Setting lport 64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 up in Southbound
Nov 25 06:26:35 compute-0 nova_compute[186241]: 2025-11-25 06:26:35.677 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.685 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[3bfd38e9-481d-4477-b4d2-b1693c2ef64e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:35 compute-0 systemd-udevd[215212]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:26:35 compute-0 systemd-machined[152921]: New machine qemu-7-instance-00000007.
Nov 25 06:26:35 compute-0 NetworkManager[55345]: <info>  [1764051995.7073] device (tap64197186-bb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:26:35 compute-0 NetworkManager[55345]: <info>  [1764051995.7081] device (tap64197186-bb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:26:35 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.714 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[36bf9e71-ed89-4c21-854a-276b92979018]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.717 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[149d7be1-4c5c-4c49-8326-dc0fb436c421]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.737 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[fcf4f110-8fd5-43c4-92a5-42f53afc5568]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.749 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[cc4e0e0c-d25c-4c5e-b014-5cc5aa236729]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7764c441-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:09:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 295029, 'reachable_time': 23204, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215223, 'error': None, 'target': 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.762 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[32cab6b1-523d-4ddc-a89d-4e2241c2daf4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap7764c441-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 295037, 'tstamp': 295037}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215225, 'error': None, 'target': 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7764c441-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 295039, 'tstamp': 295039}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215225, 'error': None, 'target': 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.763 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7764c441-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:35 compute-0 nova_compute[186241]: 2025-11-25 06:26:35.766 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.766 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7764c441-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.767 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.767 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7764c441-30, col_values=(('external_ids', {'iface-id': 'd819e567-57aa-4c38-852c-35e41fc7980c'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.767 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:26:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:35.768 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[940d8e01-4044-4713-b7a5-7fd98a9b8e60]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-7764c441-3630-43ef-a835-62532c499c69\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 7764c441-3630-43ef-a835-62532c499c69\n') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:26:35 compute-0 nova_compute[186241]: 2025-11-25 06:26:35.910 186245 DEBUG nova.compute.manager [req-e43f1384-a20f-4990-b4ff-2f6e37972bd5 req-086605fa-a920-4ea6-be6f-144d05c9b2b2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Received event network-vif-plugged-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:26:35 compute-0 nova_compute[186241]: 2025-11-25 06:26:35.910 186245 DEBUG oslo_concurrency.lockutils [req-e43f1384-a20f-4990-b4ff-2f6e37972bd5 req-086605fa-a920-4ea6-be6f-144d05c9b2b2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:35 compute-0 nova_compute[186241]: 2025-11-25 06:26:35.910 186245 DEBUG oslo_concurrency.lockutils [req-e43f1384-a20f-4990-b4ff-2f6e37972bd5 req-086605fa-a920-4ea6-be6f-144d05c9b2b2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:35 compute-0 nova_compute[186241]: 2025-11-25 06:26:35.911 186245 DEBUG oslo_concurrency.lockutils [req-e43f1384-a20f-4990-b4ff-2f6e37972bd5 req-086605fa-a920-4ea6-be6f-144d05c9b2b2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:35 compute-0 nova_compute[186241]: 2025-11-25 06:26:35.911 186245 DEBUG nova.compute.manager [req-e43f1384-a20f-4990-b4ff-2f6e37972bd5 req-086605fa-a920-4ea6-be6f-144d05c9b2b2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Processing event network-vif-plugged-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:26:36 compute-0 nova_compute[186241]: 2025-11-25 06:26:36.027 186245 DEBUG nova.compute.manager [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:26:36 compute-0 nova_compute[186241]: 2025-11-25 06:26:36.029 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:26:36 compute-0 nova_compute[186241]: 2025-11-25 06:26:36.031 186245 INFO nova.virt.libvirt.driver [-] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Instance spawned successfully.
Nov 25 06:26:36 compute-0 nova_compute[186241]: 2025-11-25 06:26:36.031 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:26:36 compute-0 nova_compute[186241]: 2025-11-25 06:26:36.539 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:26:36 compute-0 nova_compute[186241]: 2025-11-25 06:26:36.540 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:26:36 compute-0 nova_compute[186241]: 2025-11-25 06:26:36.540 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:26:36 compute-0 nova_compute[186241]: 2025-11-25 06:26:36.540 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:26:36 compute-0 nova_compute[186241]: 2025-11-25 06:26:36.540 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:26:36 compute-0 nova_compute[186241]: 2025-11-25 06:26:36.541 186245 DEBUG nova.virt.libvirt.driver [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:26:36 compute-0 nova_compute[186241]: 2025-11-25 06:26:36.996 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:37 compute-0 nova_compute[186241]: 2025-11-25 06:26:37.047 186245 INFO nova.compute.manager [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Took 12.56 seconds to spawn the instance on the hypervisor.
Nov 25 06:26:37 compute-0 nova_compute[186241]: 2025-11-25 06:26:37.047 186245 DEBUG nova.compute.manager [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:26:37 compute-0 podman[215234]: 2025-11-25 06:26:37.075972844 +0000 UTC m=+0.049980072 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:26:37 compute-0 nova_compute[186241]: 2025-11-25 06:26:37.564 186245 INFO nova.compute.manager [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Took 18.16 seconds to build instance.
Nov 25 06:26:37 compute-0 nova_compute[186241]: 2025-11-25 06:26:37.816 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:38 compute-0 nova_compute[186241]: 2025-11-25 06:26:38.066 186245 DEBUG oslo_concurrency.lockutils [None req-cb928020-62c9-430b-9958-d45815489bb4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:38 compute-0 nova_compute[186241]: 2025-11-25 06:26:38.080 186245 DEBUG nova.compute.manager [req-2bbcd662-3eed-422e-9b50-9a82edae455c req-4bf3a567-5522-4b78-99ee-33e40f794fd5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Received event network-vif-plugged-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:26:38 compute-0 nova_compute[186241]: 2025-11-25 06:26:38.080 186245 DEBUG oslo_concurrency.lockutils [req-2bbcd662-3eed-422e-9b50-9a82edae455c req-4bf3a567-5522-4b78-99ee-33e40f794fd5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:38 compute-0 nova_compute[186241]: 2025-11-25 06:26:38.080 186245 DEBUG oslo_concurrency.lockutils [req-2bbcd662-3eed-422e-9b50-9a82edae455c req-4bf3a567-5522-4b78-99ee-33e40f794fd5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:38 compute-0 nova_compute[186241]: 2025-11-25 06:26:38.080 186245 DEBUG oslo_concurrency.lockutils [req-2bbcd662-3eed-422e-9b50-9a82edae455c req-4bf3a567-5522-4b78-99ee-33e40f794fd5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:38 compute-0 nova_compute[186241]: 2025-11-25 06:26:38.081 186245 DEBUG nova.compute.manager [req-2bbcd662-3eed-422e-9b50-9a82edae455c req-4bf3a567-5522-4b78-99ee-33e40f794fd5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] No waiting events found dispatching network-vif-plugged-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:26:38 compute-0 nova_compute[186241]: 2025-11-25 06:26:38.081 186245 WARNING nova.compute.manager [req-2bbcd662-3eed-422e-9b50-9a82edae455c req-4bf3a567-5522-4b78-99ee-33e40f794fd5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Received unexpected event network-vif-plugged-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 for instance with vm_state active and task_state None.
Nov 25 06:26:41 compute-0 nova_compute[186241]: 2025-11-25 06:26:41.997 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:42 compute-0 podman[215250]: 2025-11-25 06:26:42.058484081 +0000 UTC m=+0.038376034 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, distribution-scope=public, version=9.6, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 25 06:26:42 compute-0 nova_compute[186241]: 2025-11-25 06:26:42.817 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:46 compute-0 ovn_controller[95135]: 2025-11-25T06:26:46Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a1:91:29 10.100.0.20
Nov 25 06:26:46 compute-0 ovn_controller[95135]: 2025-11-25T06:26:46Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a1:91:29 10.100.0.20
Nov 25 06:26:46 compute-0 nova_compute[186241]: 2025-11-25 06:26:46.999 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:47.587 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:26:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:47.588 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:26:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:26:47.588 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:26:47 compute-0 nova_compute[186241]: 2025-11-25 06:26:47.819 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:48 compute-0 podman[215286]: 2025-11-25 06:26:48.061079233 +0000 UTC m=+0.039130787 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 25 06:26:52 compute-0 nova_compute[186241]: 2025-11-25 06:26:52.000 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:52 compute-0 podman[215304]: 2025-11-25 06:26:52.062063049 +0000 UTC m=+0.039317799 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:26:52 compute-0 nova_compute[186241]: 2025-11-25 06:26:52.820 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:55 compute-0 nova_compute[186241]: 2025-11-25 06:26:55.447 186245 DEBUG nova.compute.manager [req-a038e869-262e-4915-8d6a-357a76576fe1 req-e45724ba-5f1e-4363-b03e-29c341ac9d4d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-changed-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:26:55 compute-0 nova_compute[186241]: 2025-11-25 06:26:55.447 186245 DEBUG nova.compute.manager [req-a038e869-262e-4915-8d6a-357a76576fe1 req-e45724ba-5f1e-4363-b03e-29c341ac9d4d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Refreshing instance network info cache due to event network-changed-6d3aa3ad-5f04-4c0f-bc86-9242dc134214. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:26:55 compute-0 nova_compute[186241]: 2025-11-25 06:26:55.448 186245 DEBUG oslo_concurrency.lockutils [req-a038e869-262e-4915-8d6a-357a76576fe1 req-e45724ba-5f1e-4363-b03e-29c341ac9d4d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:26:55 compute-0 nova_compute[186241]: 2025-11-25 06:26:55.448 186245 DEBUG oslo_concurrency.lockutils [req-a038e869-262e-4915-8d6a-357a76576fe1 req-e45724ba-5f1e-4363-b03e-29c341ac9d4d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:26:55 compute-0 nova_compute[186241]: 2025-11-25 06:26:55.448 186245 DEBUG nova.network.neutron [req-a038e869-262e-4915-8d6a-357a76576fe1 req-e45724ba-5f1e-4363-b03e-29c341ac9d4d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Refreshing network info cache for port 6d3aa3ad-5f04-4c0f-bc86-9242dc134214 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:26:57 compute-0 nova_compute[186241]: 2025-11-25 06:26:57.002 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:26:57 compute-0 nova_compute[186241]: 2025-11-25 06:26:57.821 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:00 compute-0 podman[215325]: 2025-11-25 06:27:00.110997437 +0000 UTC m=+0.083921926 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.343 186245 DEBUG oslo_concurrency.lockutils [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "a5429f72-73a0-4ab5-be90-931d49a7de1d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.343 186245 DEBUG oslo_concurrency.lockutils [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.343 186245 DEBUG oslo_concurrency.lockutils [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.343 186245 DEBUG oslo_concurrency.lockutils [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.343 186245 DEBUG oslo_concurrency.lockutils [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.344 186245 INFO nova.compute.manager [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Terminating instance
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.419 186245 DEBUG nova.network.neutron [req-a038e869-262e-4915-8d6a-357a76576fe1 req-e45724ba-5f1e-4363-b03e-29c341ac9d4d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updated VIF entry in instance network info cache for port 6d3aa3ad-5f04-4c0f-bc86-9242dc134214. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.419 186245 DEBUG nova.network.neutron [req-a038e869-262e-4915-8d6a-357a76576fe1 req-e45724ba-5f1e-4363-b03e-29c341ac9d4d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updating instance_info_cache with network_info: [{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.848 186245 DEBUG nova.compute.manager [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:27:00 compute-0 kernel: tap64197186-bb (unregistering): left promiscuous mode
Nov 25 06:27:00 compute-0 NetworkManager[55345]: <info>  [1764052020.8699] device (tap64197186-bb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.877 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:00 compute-0 ovn_controller[95135]: 2025-11-25T06:27:00Z|00112|binding|INFO|Releasing lport 64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 from this chassis (sb_readonly=0)
Nov 25 06:27:00 compute-0 ovn_controller[95135]: 2025-11-25T06:27:00Z|00113|binding|INFO|Setting lport 64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 down in Southbound
Nov 25 06:27:00 compute-0 ovn_controller[95135]: 2025-11-25T06:27:00Z|00114|binding|INFO|Removing iface tap64197186-bb ovn-installed in OVS
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.880 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.883 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:91:29 10.100.0.20'], port_security=['fa:16:3e:a1:91:29 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'a5429f72-73a0-4ab5-be90-931d49a7de1d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7764c441-3630-43ef-a835-62532c499c69', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '959f95cc-cb47-461e-b7e7-59cb871f6e80', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=376a7dc6-ccc2-4ff5-9992-66bd605dbeaf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=64197186-bbfb-4f6f-97c7-c2e7d81c0ac2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.884 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 in datapath 7764c441-3630-43ef-a835-62532c499c69 unbound from our chassis
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.884 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7764c441-3630-43ef-a835-62532c499c69
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.892 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.896 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[b3243af2-e168-4a27-9145-6427d2fc077c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:00 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 25 06:27:00 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 10.473s CPU time.
Nov 25 06:27:00 compute-0 systemd-machined[152921]: Machine qemu-7-instance-00000007 terminated.
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.918 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[49677322-a299-409d-a366-e07f61ae335a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.920 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[b862ddec-33f3-4e42-b8a7-07941dd995c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.922 186245 DEBUG oslo_concurrency.lockutils [req-a038e869-262e-4915-8d6a-357a76576fe1 req-e45724ba-5f1e-4363-b03e-29c341ac9d4d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.938 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[17806328-1098-4e3b-9468-d0075fe7d43c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.950 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[f9660eed-c38f-4d5f-8a01-5920d8cd28fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7764c441-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:09:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 8, 'rx_bytes': 790, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 8, 'rx_bytes': 790, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 295029, 'reachable_time': 19680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 7, 'inoctets': 524, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 7, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 524, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 7, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215360, 'error': None, 'target': 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.961 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5de48ce4-84ef-459f-bec6-ee6fb9cf97f3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap7764c441-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 295037, 'tstamp': 295037}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215361, 'error': None, 'target': 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7764c441-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 295039, 'tstamp': 295039}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215361, 'error': None, 'target': 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.962 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7764c441-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.962 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:00 compute-0 nova_compute[186241]: 2025-11-25 06:27:00.965 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.966 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7764c441-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.966 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.966 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7764c441-30, col_values=(('external_ids', {'iface-id': 'd819e567-57aa-4c38-852c-35e41fc7980c'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.967 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:27:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:00.967 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf596b6-60ae-45ff-8fa0-72dd603e687a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-7764c441-3630-43ef-a835-62532c499c69\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 7764c441-3630-43ef-a835-62532c499c69\n') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.060 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.063 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.083 186245 INFO nova.virt.libvirt.driver [-] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Instance destroyed successfully.
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.084 186245 DEBUG nova.objects.instance [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid a5429f72-73a0-4ab5-be90-931d49a7de1d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.405 186245 DEBUG nova.compute.manager [req-74f75294-f5dc-46f0-8616-5cc4d3dcd8b9 req-53b9c72d-1d89-468f-8264-cd923b79e157 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Received event network-vif-unplugged-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.405 186245 DEBUG oslo_concurrency.lockutils [req-74f75294-f5dc-46f0-8616-5cc4d3dcd8b9 req-53b9c72d-1d89-468f-8264-cd923b79e157 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.405 186245 DEBUG oslo_concurrency.lockutils [req-74f75294-f5dc-46f0-8616-5cc4d3dcd8b9 req-53b9c72d-1d89-468f-8264-cd923b79e157 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.405 186245 DEBUG oslo_concurrency.lockutils [req-74f75294-f5dc-46f0-8616-5cc4d3dcd8b9 req-53b9c72d-1d89-468f-8264-cd923b79e157 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.405 186245 DEBUG nova.compute.manager [req-74f75294-f5dc-46f0-8616-5cc4d3dcd8b9 req-53b9c72d-1d89-468f-8264-cd923b79e157 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] No waiting events found dispatching network-vif-unplugged-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.406 186245 DEBUG nova.compute.manager [req-74f75294-f5dc-46f0-8616-5cc4d3dcd8b9 req-53b9c72d-1d89-468f-8264-cd923b79e157 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Received event network-vif-unplugged-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.587 186245 DEBUG nova.virt.libvirt.vif [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:26:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1879170506',display_name='tempest-TestNetworkBasicOps-server-1879170506',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1879170506',id=7,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGLAKi7VLxRhsPE2r8kNdXF3PlDAAJCMi5ZM64DyUrr5++49H1lnnmy5/O3GRMuWwGEhUb0U4RubgbH+Ry2yNJJKLvppbBhaqpbtQ7/LqbkgdQFuUf6/J5amB98nrsHUfw==',key_name='tempest-TestNetworkBasicOps-1608672233',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:26:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-mt2mmusq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:26:37Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=a5429f72-73a0-4ab5-be90-931d49a7de1d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "address": "fa:16:3e:a1:91:29", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64197186-bb", "ovs_interfaceid": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.588 186245 DEBUG nova.network.os_vif_util [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "address": "fa:16:3e:a1:91:29", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap64197186-bb", "ovs_interfaceid": "64197186-bbfb-4f6f-97c7-c2e7d81c0ac2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.588 186245 DEBUG nova.network.os_vif_util [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:91:29,bridge_name='br-int',has_traffic_filtering=True,id=64197186-bbfb-4f6f-97c7-c2e7d81c0ac2,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64197186-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.589 186245 DEBUG os_vif [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:91:29,bridge_name='br-int',has_traffic_filtering=True,id=64197186-bbfb-4f6f-97c7-c2e7d81c0ac2,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64197186-bb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.590 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.590 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap64197186-bb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.593 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.594 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.594 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=6677d0b7-2200-4515-85ec-8461fe928787) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.595 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.595 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.596 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.598 186245 INFO os_vif [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:91:29,bridge_name='br-int',has_traffic_filtering=True,id=64197186-bbfb-4f6f-97c7-c2e7d81c0ac2,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64197186-bb')
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.598 186245 INFO nova.virt.libvirt.driver [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Deleting instance files /var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d_del
Nov 25 06:27:01 compute-0 nova_compute[186241]: 2025-11-25 06:27:01.599 186245 INFO nova.virt.libvirt.driver [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Deletion of /var/lib/nova/instances/a5429f72-73a0-4ab5-be90-931d49a7de1d_del complete
Nov 25 06:27:02 compute-0 nova_compute[186241]: 2025-11-25 06:27:02.003 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:02 compute-0 nova_compute[186241]: 2025-11-25 06:27:02.105 186245 INFO nova.compute.manager [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Took 1.26 seconds to destroy the instance on the hypervisor.
Nov 25 06:27:02 compute-0 nova_compute[186241]: 2025-11-25 06:27:02.106 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:27:02 compute-0 nova_compute[186241]: 2025-11-25 06:27:02.106 186245 DEBUG nova.compute.manager [-] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:27:02 compute-0 nova_compute[186241]: 2025-11-25 06:27:02.106 186245 DEBUG nova.network.neutron [-] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:27:03 compute-0 nova_compute[186241]: 2025-11-25 06:27:03.321 186245 DEBUG nova.network.neutron [-] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:27:03 compute-0 nova_compute[186241]: 2025-11-25 06:27:03.592 186245 DEBUG nova.compute.manager [req-b1398ae8-f76d-4f9d-abc7-ee222d78ee0e req-aef01a9e-3631-4f59-9621-7db53babbccb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Received event network-vif-plugged-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:27:03 compute-0 nova_compute[186241]: 2025-11-25 06:27:03.592 186245 DEBUG oslo_concurrency.lockutils [req-b1398ae8-f76d-4f9d-abc7-ee222d78ee0e req-aef01a9e-3631-4f59-9621-7db53babbccb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:03 compute-0 nova_compute[186241]: 2025-11-25 06:27:03.592 186245 DEBUG oslo_concurrency.lockutils [req-b1398ae8-f76d-4f9d-abc7-ee222d78ee0e req-aef01a9e-3631-4f59-9621-7db53babbccb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:03 compute-0 nova_compute[186241]: 2025-11-25 06:27:03.593 186245 DEBUG oslo_concurrency.lockutils [req-b1398ae8-f76d-4f9d-abc7-ee222d78ee0e req-aef01a9e-3631-4f59-9621-7db53babbccb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:03 compute-0 nova_compute[186241]: 2025-11-25 06:27:03.593 186245 DEBUG nova.compute.manager [req-b1398ae8-f76d-4f9d-abc7-ee222d78ee0e req-aef01a9e-3631-4f59-9621-7db53babbccb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] No waiting events found dispatching network-vif-plugged-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:27:03 compute-0 nova_compute[186241]: 2025-11-25 06:27:03.593 186245 WARNING nova.compute.manager [req-b1398ae8-f76d-4f9d-abc7-ee222d78ee0e req-aef01a9e-3631-4f59-9621-7db53babbccb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Received unexpected event network-vif-plugged-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 for instance with vm_state active and task_state deleting.
Nov 25 06:27:03 compute-0 nova_compute[186241]: 2025-11-25 06:27:03.593 186245 DEBUG nova.compute.manager [req-b1398ae8-f76d-4f9d-abc7-ee222d78ee0e req-aef01a9e-3631-4f59-9621-7db53babbccb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Received event network-vif-deleted-64197186-bbfb-4f6f-97c7-c2e7d81c0ac2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:27:03 compute-0 nova_compute[186241]: 2025-11-25 06:27:03.827 186245 INFO nova.compute.manager [-] [instance: a5429f72-73a0-4ab5-be90-931d49a7de1d] Took 1.72 seconds to deallocate network for instance.
Nov 25 06:27:04 compute-0 podman[215378]: 2025-11-25 06:27:04.059965707 +0000 UTC m=+0.036113454 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:27:04 compute-0 podman[215377]: 2025-11-25 06:27:04.065953416 +0000 UTC m=+0.044096962 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:27:04 compute-0 nova_compute[186241]: 2025-11-25 06:27:04.334 186245 DEBUG oslo_concurrency.lockutils [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:04 compute-0 nova_compute[186241]: 2025-11-25 06:27:04.335 186245 DEBUG oslo_concurrency.lockutils [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:04 compute-0 nova_compute[186241]: 2025-11-25 06:27:04.398 186245 DEBUG nova.compute.provider_tree [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:27:04 compute-0 nova_compute[186241]: 2025-11-25 06:27:04.903 186245 DEBUG nova.scheduler.client.report [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:27:05 compute-0 nova_compute[186241]: 2025-11-25 06:27:05.408 186245 DEBUG oslo_concurrency.lockutils [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:05 compute-0 nova_compute[186241]: 2025-11-25 06:27:05.430 186245 INFO nova.scheduler.client.report [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance a5429f72-73a0-4ab5-be90-931d49a7de1d
Nov 25 06:27:06 compute-0 nova_compute[186241]: 2025-11-25 06:27:06.437 186245 DEBUG oslo_concurrency.lockutils [None req-5664f744-db19-492f-9353-7c2e94c6485f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "a5429f72-73a0-4ab5-be90-931d49a7de1d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:06 compute-0 nova_compute[186241]: 2025-11-25 06:27:06.595 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:07 compute-0 nova_compute[186241]: 2025-11-25 06:27:07.005 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:08 compute-0 podman[215414]: 2025-11-25 06:27:08.06134642 +0000 UTC m=+0.039271653 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, 
tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 06:27:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:08.214 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:27:08 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:08.215 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:27:08 compute-0 nova_compute[186241]: 2025-11-25 06:27:08.216 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:09 compute-0 nova_compute[186241]: 2025-11-25 06:27:09.250 186245 DEBUG oslo_concurrency.lockutils [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "interface-90a703a7-09d1-4f58-84e5-80f4083b5922-6d3aa3ad-5f04-4c0f-bc86-9242dc134214" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:09 compute-0 nova_compute[186241]: 2025-11-25 06:27:09.250 186245 DEBUG oslo_concurrency.lockutils [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "interface-90a703a7-09d1-4f58-84e5-80f4083b5922-6d3aa3ad-5f04-4c0f-bc86-9242dc134214" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:09 compute-0 nova_compute[186241]: 2025-11-25 06:27:09.756 186245 DEBUG nova.objects.instance [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'flavor' on Instance uuid 90a703a7-09d1-4f58-84e5-80f4083b5922 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.260 186245 DEBUG nova.virt.libvirt.vif [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:25:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2139323515',display_name='tempest-TestNetworkBasicOps-server-2139323515',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2139323515',id=6,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPfG/Hr03+7kUyqdyJ4VcrC6OgJZvQPY0869e/9DA7kenSXh4EDJbfr323zFsTAZ1JBig6V1BBInXPavwPrKol6GncaRLGsPY2WM3LUFf75N9E/ms8i8IlOrkZUHQpzmFA==',key_name='tempest-TestNetworkBasicOps-941751953',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:25:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-4i1j2g0l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:25:21Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=90a703a7-09d1-4f58-84e5-80f4083b5922,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.260 186245 DEBUG nova.network.os_vif_util [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.261 186245 DEBUG nova.network.os_vif_util [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:cf:0b,bridge_name='br-int',has_traffic_filtering=True,id=6d3aa3ad-5f04-4c0f-bc86-9242dc134214,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d3aa3ad-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.263 186245 DEBUG nova.virt.libvirt.guest [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:15:cf:0b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6d3aa3ad-5f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.264 186245 DEBUG nova.virt.libvirt.guest [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:15:cf:0b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6d3aa3ad-5f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.265 186245 DEBUG nova.virt.libvirt.driver [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Attempting to detach device tap6d3aa3ad-5f from instance 90a703a7-09d1-4f58-84e5-80f4083b5922 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2637
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.266 186245 DEBUG nova.virt.libvirt.guest [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] detach device xml: <interface type="ethernet">
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <mac address="fa:16:3e:15:cf:0b"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <model type="virtio"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <mtu size="1442"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <target dev="tap6d3aa3ad-5f"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]: </interface>
Nov 25 06:27:10 compute-0 nova_compute[186241]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:466
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.269 186245 DEBUG nova.virt.libvirt.guest [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:15:cf:0b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6d3aa3ad-5f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.271 186245 DEBUG nova.virt.libvirt.guest [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:15:cf:0b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6d3aa3ad-5f"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <name>instance-00000006</name>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <uuid>90a703a7-09d1-4f58-84e5-80f4083b5922</uuid>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-2139323515</nova:name>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:26:03</nova:creationTime>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <nova:port uuid="83e4beda-0cfb-4824-8d25-0345811c9a67">
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <nova:port uuid="6d3aa3ad-5f04-4c0f-bc86-9242dc134214">
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:27:10 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <memory unit='KiB'>131072</memory>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <vcpu placement='static'>1</vcpu>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <resource>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <partition>/machine</partition>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </resource>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <sysinfo type='smbios'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <system>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <entry name='manufacturer'>RDO</entry>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <entry name='product'>OpenStack Compute</entry>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <entry name='serial'>90a703a7-09d1-4f58-84e5-80f4083b5922</entry>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <entry name='uuid'>90a703a7-09d1-4f58-84e5-80f4083b5922</entry>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <entry name='family'>Virtual Machine</entry>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </system>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <os>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <boot dev='hd'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <smbios mode='sysinfo'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </os>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <features>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <vmcoreinfo state='on'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </features>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <cpu mode='custom' match='exact' check='full'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <model fallback='forbid'>EPYC-Milan</model>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <vendor>AMD</vendor>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='x2apic'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc-deadline'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='hypervisor'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc_adjust'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='vaes'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='vpclmulqdq'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='spec-ctrl'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='stibp'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='ssbd'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='cmp_legacy'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='overflow-recov'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='succor'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='virt-ssbd'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='disable' name='lbrv'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='disable' name='tsc-scale'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='disable' name='vmcb-clean'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='disable' name='flushbyasid'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='disable' name='pause-filter'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='disable' name='pfthreshold'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='disable' name='v-vmsave-vmload'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='disable' name='vgif'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='disable' name='svm'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='require' name='topoext'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='disable' name='npt'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='disable' name='nrip-save'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <clock offset='utc'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <timer name='pit' tickpolicy='delay'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <timer name='hpet' present='no'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <on_poweroff>destroy</on_poweroff>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <on_reboot>restart</on_reboot>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <on_crash>destroy</on_crash>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <disk type='file' device='disk'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk' index='2'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <backingStore type='file' index='3'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:         <format type='raw'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:         <source file='/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:         <backingStore/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       </backingStore>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target dev='vda' bus='virtio'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='virtio-disk0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <disk type='file' device='cdrom'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <driver name='qemu' type='raw' cache='none'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk.config' index='1'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <backingStore/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target dev='sda' bus='sata'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <readonly/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='sata0-0-0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='0' model='pcie-root'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pcie.0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='1' port='0x10'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.1'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='2' port='0x11'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.2'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='3' port='0x12'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.3'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='4' port='0x13'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.4'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='5' port='0x14'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.5'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='6' port='0x15'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.6'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='7' port='0x16'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.7'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='8' port='0x17'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.8'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='9' port='0x18'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.9'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='10' port='0x19'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.10'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='11' port='0x1a'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.11'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='12' port='0x1b'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.12'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='13' port='0x1c'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.13'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='14' port='0x1d'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.14'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='15' port='0x1e'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.15'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='16' port='0x1f'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.16'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='17' port='0x20'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.17'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='18' port='0x21'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.18'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='19' port='0x22'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.19'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='20' port='0x23'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.20'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='21' port='0x24'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.21'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='22' port='0x25'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.22'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='23' port='0x26'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.23'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='24' port='0x27'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.24'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target chassis='25' port='0x28'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.25'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model name='pcie-pci-bridge'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='pci.26'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='usb'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <controller type='sata' index='0'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='ide'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <interface type='ethernet'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <mac address='fa:16:3e:a2:a7:44'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target dev='tap83e4beda-0c'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model type='virtio'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <driver name='vhost' rx_queue_size='512'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <mtu size='1442'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='net0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <interface type='ethernet'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <mac address='fa:16:3e:15:cf:0b'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target dev='tap6d3aa3ad-5f'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model type='virtio'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <driver name='vhost' rx_queue_size='512'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <mtu size='1442'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='net1'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <serial type='pty'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/console.log' append='off'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target type='isa-serial' port='0'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:         <model name='isa-serial'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       </target>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <console type='pty' tty='/dev/pts/0'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/console.log' append='off'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <target type='serial' port='0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </console>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <input type='tablet' bus='usb'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='input0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='usb' bus='0' port='1'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </input>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <input type='mouse' bus='ps2'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='input1'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </input>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <input type='keyboard' bus='ps2'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='input2'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </input>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <listen type='address' address='::0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </graphics>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <audio id='1' type='none'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <video>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <model type='virtio' heads='1' primary='yes'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='video0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </video>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <watchdog model='itco' action='reset'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='watchdog0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </watchdog>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <memballoon model='virtio'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <stats period='10'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='balloon0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <rng model='virtio'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <backend model='random'>/dev/urandom</backend>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <alias name='rng0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <label>system_u:system_r:svirt_t:s0:c621,c779</label>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c621,c779</imagelabel>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <label>+107:+107</label>
Nov 25 06:27:10 compute-0 nova_compute[186241]:     <imagelabel>+107:+107</imagelabel>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:27:10 compute-0 nova_compute[186241]: </domain>
Nov 25 06:27:10 compute-0 nova_compute[186241]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.272 186245 INFO nova.virt.libvirt.driver [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully detached device tap6d3aa3ad-5f from instance 90a703a7-09d1-4f58-84e5-80f4083b5922 from the persistent domain config.
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.273 186245 DEBUG nova.virt.libvirt.driver [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] (1/8): Attempting to detach device tap6d3aa3ad-5f with device alias net1 from instance 90a703a7-09d1-4f58-84e5-80f4083b5922 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2673
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.273 186245 DEBUG nova.virt.libvirt.guest [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] detach device xml: <interface type="ethernet">
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <mac address="fa:16:3e:15:cf:0b"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <model type="virtio"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <mtu size="1442"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]:   <target dev="tap6d3aa3ad-5f"/>
Nov 25 06:27:10 compute-0 nova_compute[186241]: </interface>
Nov 25 06:27:10 compute-0 nova_compute[186241]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:466
Nov 25 06:27:10 compute-0 kernel: tap6d3aa3ad-5f (unregistering): left promiscuous mode
Nov 25 06:27:10 compute-0 NetworkManager[55345]: <info>  [1764052030.3604] device (tap6d3aa3ad-5f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.367 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:10 compute-0 ovn_controller[95135]: 2025-11-25T06:27:10Z|00115|binding|INFO|Releasing lport 6d3aa3ad-5f04-4c0f-bc86-9242dc134214 from this chassis (sb_readonly=0)
Nov 25 06:27:10 compute-0 ovn_controller[95135]: 2025-11-25T06:27:10Z|00116|binding|INFO|Setting lport 6d3aa3ad-5f04-4c0f-bc86-9242dc134214 down in Southbound
Nov 25 06:27:10 compute-0 ovn_controller[95135]: 2025-11-25T06:27:10Z|00117|binding|INFO|Removing iface tap6d3aa3ad-5f ovn-installed in OVS
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.376 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.377 186245 DEBUG nova.virt.libvirt.driver [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Start waiting for the detach event from libvirt for device tap6d3aa3ad-5f with device alias net1 for instance 90a703a7-09d1-4f58-84e5-80f4083b5922 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2749
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.378 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:cf:0b 10.100.0.21', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': '90a703a7-09d1-4f58-84e5-80f4083b5922', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7764c441-3630-43ef-a835-62532c499c69', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=376a7dc6-ccc2-4ff5-9992-66bd605dbeaf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=6d3aa3ad-5f04-4c0f-bc86-9242dc134214) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.379 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 6d3aa3ad-5f04-4c0f-bc86-9242dc134214 in datapath 7764c441-3630-43ef-a835-62532c499c69 unbound from our chassis
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.380 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7764c441-3630-43ef-a835-62532c499c69, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.381 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7bcf1eb8-eee2-42b2-8808-1f66eacf99d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.381 103953 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7764c441-3630-43ef-a835-62532c499c69 namespace which is not needed anymore
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.382 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:10 compute-0 neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69[214995]: [NOTICE]   (215016) : haproxy version is 2.8.14-c23fe91
Nov 25 06:27:10 compute-0 podman[215450]: 2025-11-25 06:27:10.463548043 +0000 UTC m=+0.020028725 container kill e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 25 06:27:10 compute-0 neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69[214995]: [NOTICE]   (215016) : path to executable is /usr/sbin/haproxy
Nov 25 06:27:10 compute-0 neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69[214995]: [WARNING]  (215016) : Exiting Master process...
Nov 25 06:27:10 compute-0 neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69[214995]: [ALERT]    (215016) : Current worker (215028) exited with code 143 (Terminated)
Nov 25 06:27:10 compute-0 neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69[214995]: [WARNING]  (215016) : All workers exited. Exiting... (0)
Nov 25 06:27:10 compute-0 systemd[1]: libpod-e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7.scope: Deactivated successfully.
Nov 25 06:27:10 compute-0 podman[215461]: 2025-11-25 06:27:10.493072218 +0000 UTC m=+0.017601683 container died e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 06:27:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7-userdata-shm.mount: Deactivated successfully.
Nov 25 06:27:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-609753f362590549c07227ded02779f17ed025514ec7ac42fc942ab8b7069bd7-merged.mount: Deactivated successfully.
Nov 25 06:27:10 compute-0 podman[215461]: 2025-11-25 06:27:10.513274499 +0000 UTC m=+0.037803945 container cleanup e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 06:27:10 compute-0 podman[215463]: 2025-11-25 06:27:10.522242576 +0000 UTC m=+0.040830778 container remove e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 25 06:27:10 compute-0 systemd[1]: libpod-conmon-e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7.scope: Deactivated successfully.
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.527 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[69c8789b-864b-49b9-a34f-0b2dc3df49f8]: (4, ("Tue Nov 25 06:27:10 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69 (e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7)\ne873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7\nTue Nov 25 06:27:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7764c441-3630-43ef-a835-62532c499c69 (e873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7)\ne873c9d8df76359746d97adf9b4b2171257c0894aea5233dd20367ed634d6fe7\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.528 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8548cbe1-9eb3-48a9-987c-36d5d8a389db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.528 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7764c441-3630-43ef-a835-62532c499c69.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.528 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[dc31778a-3540-494c-bb40-0b9adc1b8ebb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.529 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7764c441-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.532 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:10 compute-0 kernel: tap7764c441-30: left promiscuous mode
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.547 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:10 compute-0 nova_compute[186241]: 2025-11-25 06:27:10.551 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.554 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8ec52935-d5ee-4f9b-850c-70f218f52c6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.564 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[f3acff30-2fcc-4526-8551-2364ef221144]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.565 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[9046e1d1-084d-4149-96cb-d25018c4683c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.577 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[caa6b00c-7bbc-4b1b-a69e-06cc665128fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 295024, 'reachable_time': 21734, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215489, 'error': None, 'target': 'ovnmeta-7764c441-3630-43ef-a835-62532c499c69', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:10 compute-0 systemd[1]: run-netns-ovnmeta\x2d7764c441\x2d3630\x2d43ef\x2da835\x2d62532c499c69.mount: Deactivated successfully.
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.579 104066 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7764c441-3630-43ef-a835-62532c499c69 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 06:27:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:10.579 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c690e8-2607-4451-8ea7-bdf441751f10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:11 compute-0 nova_compute[186241]: 2025-11-25 06:27:11.432 186245 DEBUG nova.compute.manager [req-f5cac4b4-bc17-4e00-bc54-215de2a45b63 req-91104998-afdf-4620-bae9-8de169af51cd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-vif-unplugged-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:27:11 compute-0 nova_compute[186241]: 2025-11-25 06:27:11.432 186245 DEBUG oslo_concurrency.lockutils [req-f5cac4b4-bc17-4e00-bc54-215de2a45b63 req-91104998-afdf-4620-bae9-8de169af51cd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:11 compute-0 nova_compute[186241]: 2025-11-25 06:27:11.432 186245 DEBUG oslo_concurrency.lockutils [req-f5cac4b4-bc17-4e00-bc54-215de2a45b63 req-91104998-afdf-4620-bae9-8de169af51cd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:11 compute-0 nova_compute[186241]: 2025-11-25 06:27:11.432 186245 DEBUG oslo_concurrency.lockutils [req-f5cac4b4-bc17-4e00-bc54-215de2a45b63 req-91104998-afdf-4620-bae9-8de169af51cd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:11 compute-0 nova_compute[186241]: 2025-11-25 06:27:11.432 186245 DEBUG nova.compute.manager [req-f5cac4b4-bc17-4e00-bc54-215de2a45b63 req-91104998-afdf-4620-bae9-8de169af51cd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] No waiting events found dispatching network-vif-unplugged-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:27:11 compute-0 nova_compute[186241]: 2025-11-25 06:27:11.432 186245 WARNING nova.compute.manager [req-f5cac4b4-bc17-4e00-bc54-215de2a45b63 req-91104998-afdf-4620-bae9-8de169af51cd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received unexpected event network-vif-unplugged-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 for instance with vm_state active and task_state None.
Nov 25 06:27:11 compute-0 nova_compute[186241]: 2025-11-25 06:27:11.597 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:12 compute-0 nova_compute[186241]: 2025-11-25 06:27:12.007 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:13 compute-0 podman[215490]: 2025-11-25 06:27:13.072309842 +0000 UTC m=+0.045884799 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, config_id=edpm)
Nov 25 06:27:13 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:13.217 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:13 compute-0 nova_compute[186241]: 2025-11-25 06:27:13.677 186245 DEBUG nova.compute.manager [req-5abfc1e0-63fa-4296-90c1-75fe708af4dd req-2f193463-d709-4ea6-a17c-14d36fdde7fb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-vif-plugged-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:27:13 compute-0 nova_compute[186241]: 2025-11-25 06:27:13.677 186245 DEBUG oslo_concurrency.lockutils [req-5abfc1e0-63fa-4296-90c1-75fe708af4dd req-2f193463-d709-4ea6-a17c-14d36fdde7fb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:13 compute-0 nova_compute[186241]: 2025-11-25 06:27:13.677 186245 DEBUG oslo_concurrency.lockutils [req-5abfc1e0-63fa-4296-90c1-75fe708af4dd req-2f193463-d709-4ea6-a17c-14d36fdde7fb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:13 compute-0 nova_compute[186241]: 2025-11-25 06:27:13.678 186245 DEBUG oslo_concurrency.lockutils [req-5abfc1e0-63fa-4296-90c1-75fe708af4dd req-2f193463-d709-4ea6-a17c-14d36fdde7fb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:13 compute-0 nova_compute[186241]: 2025-11-25 06:27:13.678 186245 DEBUG nova.compute.manager [req-5abfc1e0-63fa-4296-90c1-75fe708af4dd req-2f193463-d709-4ea6-a17c-14d36fdde7fb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] No waiting events found dispatching network-vif-plugged-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:27:13 compute-0 nova_compute[186241]: 2025-11-25 06:27:13.678 186245 WARNING nova.compute.manager [req-5abfc1e0-63fa-4296-90c1-75fe708af4dd req-2f193463-d709-4ea6-a17c-14d36fdde7fb a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received unexpected event network-vif-plugged-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 for instance with vm_state active and task_state None.
Nov 25 06:27:16 compute-0 nova_compute[186241]: 2025-11-25 06:27:16.599 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:17 compute-0 nova_compute[186241]: 2025-11-25 06:27:17.010 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:19 compute-0 podman[215509]: 2025-11-25 06:27:19.063999437 +0000 UTC m=+0.041680999 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 25 06:27:21 compute-0 nova_compute[186241]: 2025-11-25 06:27:21.601 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:21 compute-0 nova_compute[186241]: 2025-11-25 06:27:21.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:27:21 compute-0 nova_compute[186241]: 2025-11-25 06:27:21.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:27:22 compute-0 nova_compute[186241]: 2025-11-25 06:27:22.011 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:22 compute-0 nova_compute[186241]: 2025-11-25 06:27:22.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:27:23 compute-0 podman[215526]: 2025-11-25 06:27:23.059251511 +0000 UTC m=+0.035464828 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 25 06:27:24 compute-0 nova_compute[186241]: 2025-11-25 06:27:24.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:27:24 compute-0 nova_compute[186241]: 2025-11-25 06:27:24.932 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:27:24 compute-0 nova_compute[186241]: 2025-11-25 06:27:24.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:27:25 compute-0 nova_compute[186241]: 2025-11-25 06:27:25.443 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:25 compute-0 nova_compute[186241]: 2025-11-25 06:27:25.443 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:25 compute-0 nova_compute[186241]: 2025-11-25 06:27:25.444 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:25 compute-0 nova_compute[186241]: 2025-11-25 06:27:25.444 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:27:26 compute-0 nova_compute[186241]: 2025-11-25 06:27:26.468 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:27:26 compute-0 nova_compute[186241]: 2025-11-25 06:27:26.514 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:27:26 compute-0 nova_compute[186241]: 2025-11-25 06:27:26.514 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:27:26 compute-0 nova_compute[186241]: 2025-11-25 06:27:26.558 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:27:26 compute-0 nova_compute[186241]: 2025-11-25 06:27:26.603 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:26 compute-0 nova_compute[186241]: 2025-11-25 06:27:26.753 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:27:26 compute-0 nova_compute[186241]: 2025-11-25 06:27:26.754 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5538MB free_disk=72.98901748657227GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:27:26 compute-0 nova_compute[186241]: 2025-11-25 06:27:26.754 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:26 compute-0 nova_compute[186241]: 2025-11-25 06:27:26.754 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:27 compute-0 nova_compute[186241]: 2025-11-25 06:27:27.012 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:27 compute-0 nova_compute[186241]: 2025-11-25 06:27:27.794 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance 90a703a7-09d1-4f58-84e5-80f4083b5922 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:27:27 compute-0 nova_compute[186241]: 2025-11-25 06:27:27.795 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:27:27 compute-0 nova_compute[186241]: 2025-11-25 06:27:27.795 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:27:27 compute-0 nova_compute[186241]: 2025-11-25 06:27:27.827 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:27:28 compute-0 nova_compute[186241]: 2025-11-25 06:27:28.331 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:27:28 compute-0 nova_compute[186241]: 2025-11-25 06:27:28.837 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:27:28 compute-0 nova_compute[186241]: 2025-11-25 06:27:28.838 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:29 compute-0 nova_compute[186241]: 2025-11-25 06:27:29.834 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:27:29 compute-0 nova_compute[186241]: 2025-11-25 06:27:29.834 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.341 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.341 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.378 186245 WARNING nova.virt.libvirt.driver [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for libvirt event about the detach of device tap6d3aa3ad-5f with device alias net1 from instance 90a703a7-09d1-4f58-84e5-80f4083b5922 is timed out.
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.379 186245 DEBUG nova.virt.libvirt.guest [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:15:cf:0b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6d3aa3ad-5f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.381 186245 DEBUG nova.virt.libvirt.guest [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:15:cf:0b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6d3aa3ad-5f"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <name>instance-00000006</name>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <uuid>90a703a7-09d1-4f58-84e5-80f4083b5922</uuid>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-2139323515</nova:name>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:26:03</nova:creationTime>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:port uuid="83e4beda-0cfb-4824-8d25-0345811c9a67">
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:port uuid="6d3aa3ad-5f04-4c0f-bc86-9242dc134214">
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:27:30 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <memory unit='KiB'>131072</memory>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <vcpu placement='static'>1</vcpu>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <resource>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <partition>/machine</partition>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </resource>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <sysinfo type='smbios'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <system>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <entry name='manufacturer'>RDO</entry>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <entry name='product'>OpenStack Compute</entry>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <entry name='serial'>90a703a7-09d1-4f58-84e5-80f4083b5922</entry>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <entry name='uuid'>90a703a7-09d1-4f58-84e5-80f4083b5922</entry>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <entry name='family'>Virtual Machine</entry>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </system>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <os>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <boot dev='hd'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <smbios mode='sysinfo'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </os>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <features>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <vmcoreinfo state='on'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </features>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <cpu mode='custom' match='exact' check='full'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <model fallback='forbid'>EPYC-Milan</model>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <vendor>AMD</vendor>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='x2apic'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc-deadline'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='hypervisor'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc_adjust'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='vaes'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='vpclmulqdq'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='spec-ctrl'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='stibp'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='ssbd'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='cmp_legacy'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='overflow-recov'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='succor'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='virt-ssbd'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='disable' name='lbrv'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='disable' name='tsc-scale'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='disable' name='vmcb-clean'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='disable' name='flushbyasid'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='disable' name='pause-filter'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='disable' name='pfthreshold'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='disable' name='v-vmsave-vmload'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='disable' name='vgif'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='disable' name='svm'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='require' name='topoext'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='disable' name='npt'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='disable' name='nrip-save'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <clock offset='utc'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <timer name='pit' tickpolicy='delay'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <timer name='hpet' present='no'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <on_poweroff>destroy</on_poweroff>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <on_reboot>restart</on_reboot>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <on_crash>destroy</on_crash>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <disk type='file' device='disk'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk' index='2'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <backingStore type='file' index='3'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:         <format type='raw'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:         <source file='/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:         <backingStore/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       </backingStore>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target dev='vda' bus='virtio'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='virtio-disk0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <disk type='file' device='cdrom'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <driver name='qemu' type='raw' cache='none'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk.config' index='1'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <backingStore/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target dev='sda' bus='sata'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <readonly/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='sata0-0-0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='0' model='pcie-root'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pcie.0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='1' port='0x10'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.1'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='2' port='0x11'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.2'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='3' port='0x12'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.3'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='4' port='0x13'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.4'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='5' port='0x14'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.5'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='6' port='0x15'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.6'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='7' port='0x16'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.7'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='8' port='0x17'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.8'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='9' port='0x18'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.9'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='10' port='0x19'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.10'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='11' port='0x1a'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.11'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='12' port='0x1b'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.12'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='13' port='0x1c'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.13'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='14' port='0x1d'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.14'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='15' port='0x1e'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.15'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='16' port='0x1f'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.16'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='17' port='0x20'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.17'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='18' port='0x21'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.18'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='19' port='0x22'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.19'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='20' port='0x23'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.20'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='21' port='0x24'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.21'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='22' port='0x25'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.22'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='23' port='0x26'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.23'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='24' port='0x27'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.24'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target chassis='25' port='0x28'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.25'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model name='pcie-pci-bridge'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='pci.26'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='usb'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <controller type='sata' index='0'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='ide'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <interface type='ethernet'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <mac address='fa:16:3e:a2:a7:44'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target dev='tap83e4beda-0c'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model type='virtio'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <driver name='vhost' rx_queue_size='512'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <mtu size='1442'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='net0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <serial type='pty'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/console.log' append='off'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target type='isa-serial' port='0'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:         <model name='isa-serial'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       </target>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <console type='pty' tty='/dev/pts/0'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/console.log' append='off'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <target type='serial' port='0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </console>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <input type='tablet' bus='usb'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='input0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='usb' bus='0' port='1'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </input>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <input type='mouse' bus='ps2'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='input1'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </input>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <input type='keyboard' bus='ps2'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='input2'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </input>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <listen type='address' address='::0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </graphics>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <audio id='1' type='none'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <video>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <model type='virtio' heads='1' primary='yes'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='video0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </video>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <watchdog model='itco' action='reset'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='watchdog0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </watchdog>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <memballoon model='virtio'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <stats period='10'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='balloon0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <rng model='virtio'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <backend model='random'>/dev/urandom</backend>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <alias name='rng0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <label>system_u:system_r:svirt_t:s0:c621,c779</label>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c621,c779</imagelabel>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <label>+107:+107</label>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <imagelabel>+107:+107</imagelabel>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:27:30 compute-0 nova_compute[186241]: </domain>
Nov 25 06:27:30 compute-0 nova_compute[186241]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.382 186245 INFO nova.virt.libvirt.driver [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully detached device tap6d3aa3ad-5f from instance 90a703a7-09d1-4f58-84e5-80f4083b5922 from the live domain config.
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.383 186245 DEBUG nova.virt.libvirt.vif [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:25:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2139323515',display_name='tempest-TestNetworkBasicOps-server-2139323515',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2139323515',id=6,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPfG/Hr03+7kUyqdyJ4VcrC6OgJZvQPY0869e/9DA7kenSXh4EDJbfr323zFsTAZ1JBig6V1BBInXPavwPrKol6GncaRLGsPY2WM3LUFf75N9E/ms8i8IlOrkZUHQpzmFA==',key_name='tempest-TestNetworkBasicOps-941751953',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:25:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-4i1j2g0l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:25:21Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=90a703a7-09d1-4f58-84e5-80f4083b5922,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.384 186245 DEBUG nova.network.os_vif_util [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.384 186245 DEBUG nova.network.os_vif_util [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:cf:0b,bridge_name='br-int',has_traffic_filtering=True,id=6d3aa3ad-5f04-4c0f-bc86-9242dc134214,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d3aa3ad-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.384 186245 DEBUG os_vif [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:cf:0b,bridge_name='br-int',has_traffic_filtering=True,id=6d3aa3ad-5f04-4c0f-bc86-9242dc134214,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d3aa3ad-5f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.386 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.386 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d3aa3ad-5f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.390 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.391 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.391 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=3158fd8a-ea6e-482d-9df4-9c475095009c) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.392 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.393 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.394 186245 INFO os_vif [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:cf:0b,bridge_name='br-int',has_traffic_filtering=True,id=6d3aa3ad-5f04-4c0f-bc86-9242dc134214,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d3aa3ad-5f')
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.395 186245 DEBUG nova.virt.driver [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-2139323515', uuid='90a703a7-09d1-4f58-84e5-80f4083b5922'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus='sata',hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus='virtio',hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus='usb',hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type='q35',hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model='usbtablet',hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model='virtio',hw_video_ram=<?>,hw_vif_model='virtio',hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bit
torrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764052050.3953292) get_instance_driver_metadata 
/usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:27:30 compute-0 nova_compute[186241]: 2025-11-25 06:27:30.396 186245 DEBUG nova.virt.libvirt.guest [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-2139323515</nova:name>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:27:30</nova:creationTime>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     <nova:port uuid="83e4beda-0cfb-4824-8d25-0345811c9a67">
Nov 25 06:27:30 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 06:27:30 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:27:30 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:27:30 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:27:30 compute-0 nova_compute[186241]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:356
Nov 25 06:27:31 compute-0 podman[215554]: 2025-11-25 06:27:31.08560629 +0000 UTC m=+0.056494758 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 25 06:27:31 compute-0 nova_compute[186241]: 2025-11-25 06:27:31.990 186245 DEBUG nova.compute.manager [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-vif-deleted-6d3aa3ad-5f04-4c0f-bc86-9242dc134214 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:27:31 compute-0 nova_compute[186241]: 2025-11-25 06:27:31.991 186245 INFO nova.compute.manager [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Neutron deleted interface 6d3aa3ad-5f04-4c0f-bc86-9242dc134214; detaching it from the instance and deleting it from the info cache
Nov 25 06:27:31 compute-0 nova_compute[186241]: 2025-11-25 06:27:31.991 186245 DEBUG nova.network.neutron [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updating instance_info_cache with network_info: [{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:27:32 compute-0 nova_compute[186241]: 2025-11-25 06:27:32.014 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:32 compute-0 nova_compute[186241]: 2025-11-25 06:27:32.494 186245 DEBUG nova.objects.instance [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lazy-loading 'system_metadata' on Instance uuid 90a703a7-09d1-4f58-84e5-80f4083b5922 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:27:32 compute-0 nova_compute[186241]: 2025-11-25 06:27:32.648 186245 DEBUG oslo_concurrency.lockutils [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:27:32 compute-0 nova_compute[186241]: 2025-11-25 06:27:32.649 186245 DEBUG oslo_concurrency.lockutils [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:27:32 compute-0 nova_compute[186241]: 2025-11-25 06:27:32.649 186245 DEBUG nova.network.neutron [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:27:32 compute-0 nova_compute[186241]: 2025-11-25 06:27:32.998 186245 DEBUG nova.objects.instance [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lazy-loading 'flavor' on Instance uuid 90a703a7-09d1-4f58-84e5-80f4083b5922 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.502 186245 DEBUG nova.objects.base [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Object Instance<90a703a7-09d1-4f58-84e5-80f4083b5922> lazy-loaded attributes: system_metadata,flavor wrapper /usr/lib/python3.9/site-packages/nova/objects/base.py:136
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.502 186245 DEBUG nova.virt.libvirt.vif [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:25:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2139323515',display_name='tempest-TestNetworkBasicOps-server-2139323515',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2139323515',id=6,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPfG/Hr03+7kUyqdyJ4VcrC6OgJZvQPY0869e/9DA7kenSXh4EDJbfr323zFsTAZ1JBig6V1BBInXPavwPrKol6GncaRLGsPY2WM3LUFf75N9E/ms8i8IlOrkZUHQpzmFA==',key_name='tempest-TestNetworkBasicOps-941751953',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:25:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-4i1j2g0l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:25:21Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=90a703a7-09d1-4f58-84e5-80f4083b5922,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.502 186245 DEBUG nova.network.os_vif_util [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Converting VIF {"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.503 186245 DEBUG nova.network.os_vif_util [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:cf:0b,bridge_name='br-int',has_traffic_filtering=True,id=6d3aa3ad-5f04-4c0f-bc86-9242dc134214,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d3aa3ad-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.505 186245 DEBUG nova.virt.libvirt.guest [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:15:cf:0b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6d3aa3ad-5f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.506 186245 DEBUG nova.virt.libvirt.guest [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:15:cf:0b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6d3aa3ad-5f"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <name>instance-00000006</name>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <uuid>90a703a7-09d1-4f58-84e5-80f4083b5922</uuid>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-2139323515</nova:name>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:27:30</nova:creationTime>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:port uuid="83e4beda-0cfb-4824-8d25-0345811c9a67">
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:27:33 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <memory unit='KiB'>131072</memory>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <vcpu placement='static'>1</vcpu>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <resource>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <partition>/machine</partition>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </resource>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <sysinfo type='smbios'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <system>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <entry name='manufacturer'>RDO</entry>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <entry name='product'>OpenStack Compute</entry>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <entry name='serial'>90a703a7-09d1-4f58-84e5-80f4083b5922</entry>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <entry name='uuid'>90a703a7-09d1-4f58-84e5-80f4083b5922</entry>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <entry name='family'>Virtual Machine</entry>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </system>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <os>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <boot dev='hd'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <smbios mode='sysinfo'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </os>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <features>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <vmcoreinfo state='on'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </features>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <cpu mode='custom' match='exact' check='full'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <model fallback='forbid'>EPYC-Milan</model>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <vendor>AMD</vendor>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='x2apic'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc-deadline'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='hypervisor'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc_adjust'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='vaes'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='vpclmulqdq'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='spec-ctrl'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='stibp'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='ssbd'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='cmp_legacy'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='overflow-recov'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='succor'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='virt-ssbd'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='lbrv'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='tsc-scale'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='vmcb-clean'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='flushbyasid'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='pause-filter'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='pfthreshold'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='v-vmsave-vmload'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='vgif'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='svm'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='topoext'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='npt'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='nrip-save'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <clock offset='utc'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <timer name='pit' tickpolicy='delay'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <timer name='hpet' present='no'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <on_poweroff>destroy</on_poweroff>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <on_reboot>restart</on_reboot>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <on_crash>destroy</on_crash>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <disk type='file' device='disk'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk' index='2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <backingStore type='file' index='3'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:         <format type='raw'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:         <source file='/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:         <backingStore/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       </backingStore>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target dev='vda' bus='virtio'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='virtio-disk0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <disk type='file' device='cdrom'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <driver name='qemu' type='raw' cache='none'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk.config' index='1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <backingStore/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target dev='sda' bus='sata'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <readonly/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='sata0-0-0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='0' model='pcie-root'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pcie.0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='1' port='0x10'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='2' port='0x11'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='3' port='0x12'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.3'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='4' port='0x13'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.4'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='5' port='0x14'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.5'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='6' port='0x15'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.6'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='7' port='0x16'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.7'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='8' port='0x17'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.8'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='9' port='0x18'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.9'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='10' port='0x19'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.10'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='11' port='0x1a'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.11'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='12' port='0x1b'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.12'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='13' port='0x1c'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.13'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='14' port='0x1d'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.14'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='15' port='0x1e'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.15'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='16' port='0x1f'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.16'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='17' port='0x20'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.17'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='18' port='0x21'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.18'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='19' port='0x22'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.19'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='20' port='0x23'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.20'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='21' port='0x24'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.21'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='22' port='0x25'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.22'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='23' port='0x26'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.23'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='24' port='0x27'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.24'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='25' port='0x28'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.25'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-pci-bridge'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.26'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='usb'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='sata' index='0'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='ide'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <interface type='ethernet'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <mac address='fa:16:3e:a2:a7:44'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target dev='tap83e4beda-0c'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model type='virtio'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <driver name='vhost' rx_queue_size='512'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <mtu size='1442'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='net0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <serial type='pty'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/console.log' append='off'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target type='isa-serial' port='0'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:         <model name='isa-serial'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       </target>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <console type='pty' tty='/dev/pts/0'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/console.log' append='off'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target type='serial' port='0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </console>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <input type='tablet' bus='usb'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='input0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='usb' bus='0' port='1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </input>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <input type='mouse' bus='ps2'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='input1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </input>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <input type='keyboard' bus='ps2'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='input2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </input>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <listen type='address' address='::0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </graphics>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <audio id='1' type='none'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <video>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model type='virtio' heads='1' primary='yes'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='video0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </video>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <watchdog model='itco' action='reset'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='watchdog0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </watchdog>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <memballoon model='virtio'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <stats period='10'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='balloon0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <rng model='virtio'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <backend model='random'>/dev/urandom</backend>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='rng0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <label>system_u:system_r:svirt_t:s0:c621,c779</label>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c621,c779</imagelabel>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <label>+107:+107</label>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <imagelabel>+107:+107</imagelabel>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:27:33 compute-0 nova_compute[186241]: </domain>
Nov 25 06:27:33 compute-0 nova_compute[186241]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.507 186245 DEBUG nova.virt.libvirt.guest [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:15:cf:0b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6d3aa3ad-5f"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.509 186245 DEBUG nova.virt.libvirt.guest [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:15:cf:0b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6d3aa3ad-5f"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <name>instance-00000006</name>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <uuid>90a703a7-09d1-4f58-84e5-80f4083b5922</uuid>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-2139323515</nova:name>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:27:30</nova:creationTime>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:port uuid="83e4beda-0cfb-4824-8d25-0345811c9a67">
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:27:33 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <memory unit='KiB'>131072</memory>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <vcpu placement='static'>1</vcpu>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <resource>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <partition>/machine</partition>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </resource>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <sysinfo type='smbios'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <system>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <entry name='manufacturer'>RDO</entry>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <entry name='product'>OpenStack Compute</entry>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <entry name='version'>31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <entry name='serial'>90a703a7-09d1-4f58-84e5-80f4083b5922</entry>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <entry name='uuid'>90a703a7-09d1-4f58-84e5-80f4083b5922</entry>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <entry name='family'>Virtual Machine</entry>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </system>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <os>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <boot dev='hd'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <smbios mode='sysinfo'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </os>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <features>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <vmcoreinfo state='on'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </features>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <cpu mode='custom' match='exact' check='full'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <model fallback='forbid'>EPYC-Milan</model>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <vendor>AMD</vendor>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='x2apic'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc-deadline'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='hypervisor'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='tsc_adjust'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='vaes'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='vpclmulqdq'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='spec-ctrl'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='stibp'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='ssbd'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='cmp_legacy'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='overflow-recov'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='succor'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='virt-ssbd'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='lbrv'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='tsc-scale'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='vmcb-clean'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='flushbyasid'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='pause-filter'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='pfthreshold'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='v-vmsave-vmload'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='vgif'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='svm'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='require' name='topoext'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='npt'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='nrip-save'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <clock offset='utc'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <timer name='pit' tickpolicy='delay'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <timer name='hpet' present='no'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <on_poweroff>destroy</on_poweroff>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <on_reboot>restart</on_reboot>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <on_crash>destroy</on_crash>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <disk type='file' device='disk'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk' index='2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <backingStore type='file' index='3'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:         <format type='raw'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:         <source file='/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:         <backingStore/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       </backingStore>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target dev='vda' bus='virtio'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='virtio-disk0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <disk type='file' device='cdrom'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <driver name='qemu' type='raw' cache='none'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <source file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/disk.config' index='1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <backingStore/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target dev='sda' bus='sata'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <readonly/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='sata0-0-0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='0' model='pcie-root'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pcie.0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='1' port='0x10'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='2' port='0x11'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='3' port='0x12'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.3'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='4' port='0x13'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.4'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='5' port='0x14'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.5'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='6' port='0x15'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.6'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='7' port='0x16'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.7'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='8' port='0x17'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.8'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='9' port='0x18'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.9'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='10' port='0x19'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.10'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='11' port='0x1a'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.11'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='12' port='0x1b'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.12'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='13' port='0x1c'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.13'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='14' port='0x1d'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.14'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='15' port='0x1e'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.15'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='16' port='0x1f'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.16'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='17' port='0x20'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.17'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='18' port='0x21'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.18'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='19' port='0x22'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.19'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='20' port='0x23'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.20'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='21' port='0x24'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.21'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='22' port='0x25'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.22'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='23' port='0x26'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.23'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='24' port='0x27'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.24'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-root-port'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target chassis='25' port='0x28'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.25'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model name='pcie-pci-bridge'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='pci.26'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='usb'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <controller type='sata' index='0'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='ide'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </controller>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <interface type='ethernet'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <mac address='fa:16:3e:a2:a7:44'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target dev='tap83e4beda-0c'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model type='virtio'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <driver name='vhost' rx_queue_size='512'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <mtu size='1442'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='net0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <serial type='pty'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/console.log' append='off'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target type='isa-serial' port='0'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:         <model name='isa-serial'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       </target>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <console type='pty' tty='/dev/pts/0'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <source path='/dev/pts/0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <log file='/var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922/console.log' append='off'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <target type='serial' port='0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='serial0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </console>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <input type='tablet' bus='usb'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='input0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='usb' bus='0' port='1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </input>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <input type='mouse' bus='ps2'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='input1'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </input>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <input type='keyboard' bus='ps2'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='input2'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </input>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <listen type='address' address='::0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </graphics>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <audio id='1' type='none'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <video>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <model type='virtio' heads='1' primary='yes'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='video0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </video>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <watchdog model='itco' action='reset'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='watchdog0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </watchdog>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <memballoon model='virtio'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <stats period='10'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='balloon0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <rng model='virtio'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <backend model='random'>/dev/urandom</backend>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <alias name='rng0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <label>system_u:system_r:svirt_t:s0:c621,c779</label>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c621,c779</imagelabel>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <label>+107:+107</label>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <imagelabel>+107:+107</imagelabel>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </seclabel>
Nov 25 06:27:33 compute-0 nova_compute[186241]: </domain>
Nov 25 06:27:33 compute-0 nova_compute[186241]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.510 186245 WARNING nova.virt.libvirt.driver [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Detaching interface fa:16:3e:15:cf:0b failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap6d3aa3ad-5f' not found.
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.510 186245 DEBUG nova.virt.libvirt.vif [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:25:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2139323515',display_name='tempest-TestNetworkBasicOps-server-2139323515',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2139323515',id=6,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPfG/Hr03+7kUyqdyJ4VcrC6OgJZvQPY0869e/9DA7kenSXh4EDJbfr323zFsTAZ1JBig6V1BBInXPavwPrKol6GncaRLGsPY2WM3LUFf75N9E/ms8i8IlOrkZUHQpzmFA==',key_name='tempest-TestNetworkBasicOps-941751953',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:25:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-4i1j2g0l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:25:21Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=90a703a7-09d1-4f58-84e5-80f4083b5922,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.510 186245 DEBUG nova.network.os_vif_util [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Converting VIF {"id": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "address": "fa:16:3e:15:cf:0b", "network": {"id": "7764c441-3630-43ef-a835-62532c499c69", "bridge": "br-int", "label": "tempest-network-smoke--723277504", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d3aa3ad-5f", "ovs_interfaceid": "6d3aa3ad-5f04-4c0f-bc86-9242dc134214", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.511 186245 DEBUG nova.network.os_vif_util [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:cf:0b,bridge_name='br-int',has_traffic_filtering=True,id=6d3aa3ad-5f04-4c0f-bc86-9242dc134214,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d3aa3ad-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.511 186245 DEBUG os_vif [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:cf:0b,bridge_name='br-int',has_traffic_filtering=True,id=6d3aa3ad-5f04-4c0f-bc86-9242dc134214,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d3aa3ad-5f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.512 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.512 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d3aa3ad-5f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.512 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.514 186245 INFO os_vif [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:cf:0b,bridge_name='br-int',has_traffic_filtering=True,id=6d3aa3ad-5f04-4c0f-bc86-9242dc134214,network=Network(7764c441-3630-43ef-a835-62532c499c69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d3aa3ad-5f')
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.514 186245 DEBUG nova.virt.driver [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-2139323515', uuid='90a703a7-09d1-4f58-84e5-80f4083b5922'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus='sata',hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus='virtio',hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus='usb',hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type='q35',hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model='usbtablet',hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model='virtio',hw_video_ram=<?>,hw_vif_model='virtio',hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchd
og_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764052053.5145738) 
get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:27:33 compute-0 nova_compute[186241]: 2025-11-25 06:27:33.515 186245 DEBUG nova.virt.libvirt.guest [req-aab5c574-36b8-4734-81f4-f750226b7648 req-2cc38958-96c3-4f66-8b5e-64268e268cd2 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:name>tempest-TestNetworkBasicOps-server-2139323515</nova:name>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:creationTime>2025-11-25 06:27:33</nova:creationTime>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:flavor name="m1.nano">
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:memory>128</nova:memory>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:disk>1</nova:disk>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:swap>0</nova:swap>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:vcpus>1</nova:vcpus>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </nova:flavor>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:owner>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </nova:owner>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   <nova:ports>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     <nova:port uuid="83e4beda-0cfb-4824-8d25-0345811c9a67">
Nov 25 06:27:33 compute-0 nova_compute[186241]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 06:27:33 compute-0 nova_compute[186241]:     </nova:port>
Nov 25 06:27:33 compute-0 nova_compute[186241]:   </nova:ports>
Nov 25 06:27:33 compute-0 nova_compute[186241]: </nova:instance>
Nov 25 06:27:33 compute-0 nova_compute[186241]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:356
Nov 25 06:27:34 compute-0 ovn_controller[95135]: 2025-11-25T06:27:34Z|00118|binding|INFO|Releasing lport adae0e10-6930-4117-9b4f-dd5ad7e75d7a from this chassis (sb_readonly=0)
Nov 25 06:27:34 compute-0 nova_compute[186241]: 2025-11-25 06:27:34.471 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:35 compute-0 podman[215577]: 2025-11-25 06:27:35.064774092 +0000 UTC m=+0.039097012 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 06:27:35 compute-0 podman[215578]: 2025-11-25 06:27:35.08694931 +0000 UTC m=+0.060957165 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:27:35 compute-0 nova_compute[186241]: 2025-11-25 06:27:35.393 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:35 compute-0 nova_compute[186241]: 2025-11-25 06:27:35.500 186245 DEBUG nova.compute.manager [req-7e20a6db-8c90-4d79-b610-c5360629683a req-86aec5a0-5916-4b5a-b131-a279d6f2b63c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-changed-83e4beda-0cfb-4824-8d25-0345811c9a67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:27:35 compute-0 nova_compute[186241]: 2025-11-25 06:27:35.500 186245 DEBUG nova.compute.manager [req-7e20a6db-8c90-4d79-b610-c5360629683a req-86aec5a0-5916-4b5a-b131-a279d6f2b63c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Refreshing instance network info cache due to event network-changed-83e4beda-0cfb-4824-8d25-0345811c9a67. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:27:35 compute-0 nova_compute[186241]: 2025-11-25 06:27:35.501 186245 DEBUG oslo_concurrency.lockutils [req-7e20a6db-8c90-4d79-b610-c5360629683a req-86aec5a0-5916-4b5a-b131-a279d6f2b63c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.069 186245 DEBUG oslo_concurrency.lockutils [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.069 186245 DEBUG oslo_concurrency.lockutils [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.069 186245 DEBUG oslo_concurrency.lockutils [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.069 186245 DEBUG oslo_concurrency.lockutils [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.069 186245 DEBUG oslo_concurrency.lockutils [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.070 186245 INFO nova.compute.manager [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Terminating instance
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.573 186245 DEBUG nova.compute.manager [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:27:36 compute-0 kernel: tap83e4beda-0c (unregistering): left promiscuous mode
Nov 25 06:27:36 compute-0 NetworkManager[55345]: <info>  [1764052056.5992] device (tap83e4beda-0c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:27:36 compute-0 ovn_controller[95135]: 2025-11-25T06:27:36Z|00119|binding|INFO|Releasing lport 83e4beda-0cfb-4824-8d25-0345811c9a67 from this chassis (sb_readonly=0)
Nov 25 06:27:36 compute-0 ovn_controller[95135]: 2025-11-25T06:27:36Z|00120|binding|INFO|Setting lport 83e4beda-0cfb-4824-8d25-0345811c9a67 down in Southbound
Nov 25 06:27:36 compute-0 ovn_controller[95135]: 2025-11-25T06:27:36Z|00121|binding|INFO|Removing iface tap83e4beda-0c ovn-installed in OVS
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.602 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.604 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.607 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:a7:44 10.100.0.5'], port_security=['fa:16:3e:a2:a7:44 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '90a703a7-09d1-4f58-84e5-80f4083b5922', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bdd0af2e-c79c-421a-a113-be4d7ab826e9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'aee7b322-406a-47f1-954e-0d371991f172', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d69e634-57f2-49e0-8c89-35d178b67c36, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=83e4beda-0cfb-4824-8d25-0345811c9a67) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.608 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 83e4beda-0cfb-4824-8d25-0345811c9a67 in datapath bdd0af2e-c79c-421a-a113-be4d7ab826e9 unbound from our chassis
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.609 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bdd0af2e-c79c-421a-a113-be4d7ab826e9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.609 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[880a43ab-044f-4b90-b673-62488694ca4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.610 103953 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9 namespace which is not needed anymore
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.618 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:36 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Nov 25 06:27:36 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 13.967s CPU time.
Nov 25 06:27:36 compute-0 systemd-machined[152921]: Machine qemu-6-instance-00000006 terminated.
Nov 25 06:27:36 compute-0 neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9[214719]: [NOTICE]   (214723) : haproxy version is 2.8.14-c23fe91
Nov 25 06:27:36 compute-0 neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9[214719]: [NOTICE]   (214723) : path to executable is /usr/sbin/haproxy
Nov 25 06:27:36 compute-0 neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9[214719]: [WARNING]  (214723) : Exiting Master process...
Nov 25 06:27:36 compute-0 podman[215637]: 2025-11-25 06:27:36.691594515 +0000 UTC m=+0.020323661 container kill bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:27:36 compute-0 neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9[214719]: [ALERT]    (214723) : Current worker (214725) exited with code 143 (Terminated)
Nov 25 06:27:36 compute-0 neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9[214719]: [WARNING]  (214723) : All workers exited. Exiting... (0)
Nov 25 06:27:36 compute-0 systemd[1]: libpod-bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49.scope: Deactivated successfully.
Nov 25 06:27:36 compute-0 podman[215649]: 2025-11-25 06:27:36.724059831 +0000 UTC m=+0.016907655 container died bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 25 06:27:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49-userdata-shm.mount: Deactivated successfully.
Nov 25 06:27:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-07ab3e65389ddb6ffb244ead935183eee96a6bcc9748b35cbe3f2448b6efe311-merged.mount: Deactivated successfully.
Nov 25 06:27:36 compute-0 podman[215649]: 2025-11-25 06:27:36.742579242 +0000 UTC m=+0.035427066 container cleanup bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 06:27:36 compute-0 systemd[1]: libpod-conmon-bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49.scope: Deactivated successfully.
Nov 25 06:27:36 compute-0 podman[215650]: 2025-11-25 06:27:36.749976939 +0000 UTC m=+0.040839675 container remove bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.753 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[33db1ceb-2d9b-45d6-97de-eea6353546ee]: (4, ("Tue Nov 25 06:27:36 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9 (bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49)\nbb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49\nTue Nov 25 06:27:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9 (bb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49)\nbb6ae437b7782357109ea9947b6992cf385622695a70dcba3e9adda4928faa49\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.754 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[be6df5cc-2edf-4f29-9507-67624b618dd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.754 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bdd0af2e-c79c-421a-a113-be4d7ab826e9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bdd0af2e-c79c-421a-a113-be4d7ab826e9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.755 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[1da6fca3-3d4e-41eb-a8e8-e3b234b6f1ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.755 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbdd0af2e-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.757 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:36 compute-0 kernel: tapbdd0af2e-c0: left promiscuous mode
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.772 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.774 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[1680080d-fac6-4206-a017-efaf4baebf2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.781 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[3d198a55-8c66-48a3-b273-97d0ad35d1af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.782 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ec0fe214-98d7-4833-8402-35bf3f1d556c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.795 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[f69f4db3-1fe4-4a6f-8bdc-4137481bfc4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 290829, 'reachable_time': 25232, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215679, 'error': None, 'target': 'ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.799 104066 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bdd0af2e-c79c-421a-a113-be4d7ab826e9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 06:27:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:36.799 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[3198cebf-54e2-45a7-8db6-d6a2cc816934]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:36 compute-0 systemd[1]: run-netns-ovnmeta\x2dbdd0af2e\x2dc79c\x2d421a\x2da113\x2dbe4d7ab826e9.mount: Deactivated successfully.
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.810 186245 INFO nova.virt.libvirt.driver [-] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Instance destroyed successfully.
Nov 25 06:27:36 compute-0 nova_compute[186241]: 2025-11-25 06:27:36.811 186245 DEBUG nova.objects.instance [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid 90a703a7-09d1-4f58-84e5-80f4083b5922 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.016 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.255 186245 DEBUG nova.network.neutron [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updating instance_info_cache with network_info: [{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.314 186245 DEBUG nova.virt.libvirt.vif [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:25:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2139323515',display_name='tempest-TestNetworkBasicOps-server-2139323515',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2139323515',id=6,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPfG/Hr03+7kUyqdyJ4VcrC6OgJZvQPY0869e/9DA7kenSXh4EDJbfr323zFsTAZ1JBig6V1BBInXPavwPrKol6GncaRLGsPY2WM3LUFf75N9E/ms8i8IlOrkZUHQpzmFA==',key_name='tempest-TestNetworkBasicOps-941751953',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:25:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-4i1j2g0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:25:21Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=90a703a7-09d1-4f58-84e5-80f4083b5922,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.314 186245 DEBUG nova.network.os_vif_util [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.314 186245 DEBUG nova.network.os_vif_util [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a2:a7:44,bridge_name='br-int',has_traffic_filtering=True,id=83e4beda-0cfb-4824-8d25-0345811c9a67,network=Network(bdd0af2e-c79c-421a-a113-be4d7ab826e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83e4beda-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.315 186245 DEBUG os_vif [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:a7:44,bridge_name='br-int',has_traffic_filtering=True,id=83e4beda-0cfb-4824-8d25-0345811c9a67,network=Network(bdd0af2e-c79c-421a-a113-be4d7ab826e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83e4beda-0c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.316 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.316 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83e4beda-0c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.317 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.318 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.318 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.318 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=526a557f-6104-4183-92ec-dc176bfd84ad) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.319 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.319 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.321 186245 INFO os_vif [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:a7:44,bridge_name='br-int',has_traffic_filtering=True,id=83e4beda-0cfb-4824-8d25-0345811c9a67,network=Network(bdd0af2e-c79c-421a-a113-be4d7ab826e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83e4beda-0c')
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.321 186245 INFO nova.virt.libvirt.driver [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Deleting instance files /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922_del
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.322 186245 INFO nova.virt.libvirt.driver [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Deletion of /var/lib/nova/instances/90a703a7-09d1-4f58-84e5-80f4083b5922_del complete
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.637 186245 DEBUG nova.compute.manager [req-34ceba08-f0c4-4c06-b1a1-420cd745ae3a req-f276d2d2-2602-4974-bf08-ff5e704b86b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-vif-unplugged-83e4beda-0cfb-4824-8d25-0345811c9a67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.637 186245 DEBUG oslo_concurrency.lockutils [req-34ceba08-f0c4-4c06-b1a1-420cd745ae3a req-f276d2d2-2602-4974-bf08-ff5e704b86b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.637 186245 DEBUG oslo_concurrency.lockutils [req-34ceba08-f0c4-4c06-b1a1-420cd745ae3a req-f276d2d2-2602-4974-bf08-ff5e704b86b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.638 186245 DEBUG oslo_concurrency.lockutils [req-34ceba08-f0c4-4c06-b1a1-420cd745ae3a req-f276d2d2-2602-4974-bf08-ff5e704b86b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.638 186245 DEBUG nova.compute.manager [req-34ceba08-f0c4-4c06-b1a1-420cd745ae3a req-f276d2d2-2602-4974-bf08-ff5e704b86b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] No waiting events found dispatching network-vif-unplugged-83e4beda-0cfb-4824-8d25-0345811c9a67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.638 186245 DEBUG nova.compute.manager [req-34ceba08-f0c4-4c06-b1a1-420cd745ae3a req-f276d2d2-2602-4974-bf08-ff5e704b86b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-vif-unplugged-83e4beda-0cfb-4824-8d25-0345811c9a67 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.638 186245 DEBUG nova.compute.manager [req-34ceba08-f0c4-4c06-b1a1-420cd745ae3a req-f276d2d2-2602-4974-bf08-ff5e704b86b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-vif-plugged-83e4beda-0cfb-4824-8d25-0345811c9a67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.638 186245 DEBUG oslo_concurrency.lockutils [req-34ceba08-f0c4-4c06-b1a1-420cd745ae3a req-f276d2d2-2602-4974-bf08-ff5e704b86b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.638 186245 DEBUG oslo_concurrency.lockutils [req-34ceba08-f0c4-4c06-b1a1-420cd745ae3a req-f276d2d2-2602-4974-bf08-ff5e704b86b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.638 186245 DEBUG oslo_concurrency.lockutils [req-34ceba08-f0c4-4c06-b1a1-420cd745ae3a req-f276d2d2-2602-4974-bf08-ff5e704b86b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.638 186245 DEBUG nova.compute.manager [req-34ceba08-f0c4-4c06-b1a1-420cd745ae3a req-f276d2d2-2602-4974-bf08-ff5e704b86b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] No waiting events found dispatching network-vif-plugged-83e4beda-0cfb-4824-8d25-0345811c9a67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.639 186245 WARNING nova.compute.manager [req-34ceba08-f0c4-4c06-b1a1-420cd745ae3a req-f276d2d2-2602-4974-bf08-ff5e704b86b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received unexpected event network-vif-plugged-83e4beda-0cfb-4824-8d25-0345811c9a67 for instance with vm_state active and task_state deleting.
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.758 186245 DEBUG oslo_concurrency.lockutils [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.760 186245 DEBUG oslo_concurrency.lockutils [req-7e20a6db-8c90-4d79-b610-c5360629683a req-86aec5a0-5916-4b5a-b131-a279d6f2b63c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.760 186245 DEBUG nova.network.neutron [req-7e20a6db-8c90-4d79-b610-c5360629683a req-86aec5a0-5916-4b5a-b131-a279d6f2b63c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Refreshing network info cache for port 83e4beda-0cfb-4824-8d25-0345811c9a67 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.829 186245 INFO nova.compute.manager [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Took 1.26 seconds to destroy the instance on the hypervisor.
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.829 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.830 186245 DEBUG nova.compute.manager [-] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:27:37 compute-0 nova_compute[186241]: 2025-11-25 06:27:37.830 186245 DEBUG nova.network.neutron [-] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:27:38 compute-0 nova_compute[186241]: 2025-11-25 06:27:38.264 186245 DEBUG oslo_concurrency.lockutils [None req-f11a509f-d264-4668-af26-e3811f14c5c2 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "interface-90a703a7-09d1-4f58-84e5-80f4083b5922-6d3aa3ad-5f04-4c0f-bc86-9242dc134214" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 29.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:39 compute-0 podman[215695]: 2025-11-25 06:27:39.055109913 +0000 UTC m=+0.032424019 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 06:27:39 compute-0 nova_compute[186241]: 2025-11-25 06:27:39.162 186245 DEBUG nova.network.neutron [-] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:27:39 compute-0 nova_compute[186241]: 2025-11-25 06:27:39.665 186245 INFO nova.compute.manager [-] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Took 1.84 seconds to deallocate network for instance.
Nov 25 06:27:39 compute-0 nova_compute[186241]: 2025-11-25 06:27:39.782 186245 DEBUG nova.compute.manager [req-9207c1d1-1d7f-47a6-baaf-2154cc94a4ff req-45a94f38-7cef-4620-8b9e-4f3e47cfb34b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Received event network-vif-deleted-83e4beda-0cfb-4824-8d25-0345811c9a67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:27:40 compute-0 nova_compute[186241]: 2025-11-25 06:27:40.170 186245 DEBUG oslo_concurrency.lockutils [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:40 compute-0 nova_compute[186241]: 2025-11-25 06:27:40.171 186245 DEBUG oslo_concurrency.lockutils [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:40 compute-0 nova_compute[186241]: 2025-11-25 06:27:40.212 186245 DEBUG nova.compute.provider_tree [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:27:40 compute-0 nova_compute[186241]: 2025-11-25 06:27:40.716 186245 DEBUG nova.scheduler.client.report [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:27:41 compute-0 nova_compute[186241]: 2025-11-25 06:27:41.221 186245 DEBUG oslo_concurrency.lockutils [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:41 compute-0 nova_compute[186241]: 2025-11-25 06:27:41.244 186245 INFO nova.scheduler.client.report [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance 90a703a7-09d1-4f58-84e5-80f4083b5922
Nov 25 06:27:42 compute-0 nova_compute[186241]: 2025-11-25 06:27:42.018 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:42 compute-0 nova_compute[186241]: 2025-11-25 06:27:42.253 186245 DEBUG oslo_concurrency.lockutils [None req-946b8432-dc8a-48ce-b930-5347f347c08f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "90a703a7-09d1-4f58-84e5-80f4083b5922" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:42 compute-0 nova_compute[186241]: 2025-11-25 06:27:42.319 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:42 compute-0 nova_compute[186241]: 2025-11-25 06:27:42.531 186245 DEBUG nova.network.neutron [req-7e20a6db-8c90-4d79-b610-c5360629683a req-86aec5a0-5916-4b5a-b131-a279d6f2b63c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updated VIF entry in instance network info cache for port 83e4beda-0cfb-4824-8d25-0345811c9a67. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:27:42 compute-0 nova_compute[186241]: 2025-11-25 06:27:42.531 186245 DEBUG nova.network.neutron [req-7e20a6db-8c90-4d79-b610-c5360629683a req-86aec5a0-5916-4b5a-b131-a279d6f2b63c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 90a703a7-09d1-4f58-84e5-80f4083b5922] Updating instance_info_cache with network_info: [{"id": "83e4beda-0cfb-4824-8d25-0345811c9a67", "address": "fa:16:3e:a2:a7:44", "network": {"id": "bdd0af2e-c79c-421a-a113-be4d7ab826e9", "bridge": "br-int", "label": "tempest-network-smoke--82083730", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83e4beda-0c", "ovs_interfaceid": "83e4beda-0cfb-4824-8d25-0345811c9a67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:27:43 compute-0 nova_compute[186241]: 2025-11-25 06:27:43.034 186245 DEBUG oslo_concurrency.lockutils [req-7e20a6db-8c90-4d79-b610-c5360629683a req-86aec5a0-5916-4b5a-b131-a279d6f2b63c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-90a703a7-09d1-4f58-84e5-80f4083b5922" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:27:44 compute-0 podman[215711]: 2025-11-25 06:27:44.062211758 +0000 UTC m=+0.040766597 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, release=1755695350, io.buildah.version=1.33.7, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 25 06:27:47 compute-0 nova_compute[186241]: 2025-11-25 06:27:47.018 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:47 compute-0 nova_compute[186241]: 2025-11-25 06:27:47.320 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:47.636 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:27:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:47.636 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:27:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:47.637 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:27:48 compute-0 nova_compute[186241]: 2025-11-25 06:27:48.511 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:48 compute-0 nova_compute[186241]: 2025-11-25 06:27:48.587 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:50 compute-0 podman[215731]: 2025-11-25 06:27:50.057140147 +0000 UTC m=+0.037294606 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 25 06:27:52 compute-0 nova_compute[186241]: 2025-11-25 06:27:52.022 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:52 compute-0 nova_compute[186241]: 2025-11-25 06:27:52.321 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:53 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:53.673 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:8d:19 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ac93797-d190-4534-9cfc-8a64cabfa9fd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=8b4b1775-6249-4ffb-bb90-6cf9cfca84ca) old=Port_Binding(mac=['fa:16:3e:17:8d:19'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:27:53 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:53.674 103953 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 8b4b1775-6249-4ffb-bb90-6cf9cfca84ca in datapath 1d238697-f844-4698-9f1c-19ed6cd73eb8 updated
Nov 25 06:27:53 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:53.674 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d238697-f844-4698-9f1c-19ed6cd73eb8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:27:53 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:27:53.675 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[801f773a-95ad-431c-998e-c6702add27b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:27:54 compute-0 podman[215748]: 2025-11-25 06:27:54.061918498 +0000 UTC m=+0.036482574 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:27:57 compute-0 nova_compute[186241]: 2025-11-25 06:27:57.024 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:57 compute-0 nova_compute[186241]: 2025-11-25 06:27:57.322 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.550 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.551 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:27:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:27:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:28:02 compute-0 nova_compute[186241]: 2025-11-25 06:28:02.025 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:02 compute-0 podman[215769]: 2025-11-25 06:28:02.071740271 +0000 UTC m=+0.051307406 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 25 06:28:02 compute-0 nova_compute[186241]: 2025-11-25 06:28:02.324 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:04 compute-0 nova_compute[186241]: 2025-11-25 06:28:04.943 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:04 compute-0 nova_compute[186241]: 2025-11-25 06:28:04.943 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:05 compute-0 nova_compute[186241]: 2025-11-25 06:28:05.445 186245 DEBUG nova.compute.manager [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:28:05 compute-0 nova_compute[186241]: 2025-11-25 06:28:05.977 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:05 compute-0 nova_compute[186241]: 2025-11-25 06:28:05.977 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:05 compute-0 nova_compute[186241]: 2025-11-25 06:28:05.982 186245 DEBUG nova.virt.hardware [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:28:05 compute-0 nova_compute[186241]: 2025-11-25 06:28:05.982 186245 INFO nova.compute.claims [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:28:06 compute-0 podman[215792]: 2025-11-25 06:28:06.064966838 +0000 UTC m=+0.042632412 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 25 06:28:06 compute-0 podman[215793]: 2025-11-25 06:28:06.07008509 +0000 UTC m=+0.044871520 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:28:07 compute-0 nova_compute[186241]: 2025-11-25 06:28:07.026 186245 DEBUG nova.compute.provider_tree [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:28:07 compute-0 nova_compute[186241]: 2025-11-25 06:28:07.027 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:07 compute-0 nova_compute[186241]: 2025-11-25 06:28:07.325 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:07 compute-0 nova_compute[186241]: 2025-11-25 06:28:07.530 186245 DEBUG nova.scheduler.client.report [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:28:08 compute-0 nova_compute[186241]: 2025-11-25 06:28:08.035 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:08 compute-0 nova_compute[186241]: 2025-11-25 06:28:08.035 186245 DEBUG nova.compute.manager [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:28:08 compute-0 nova_compute[186241]: 2025-11-25 06:28:08.544 186245 DEBUG nova.compute.manager [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:28:08 compute-0 nova_compute[186241]: 2025-11-25 06:28:08.544 186245 DEBUG nova.network.neutron [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:28:09 compute-0 nova_compute[186241]: 2025-11-25 06:28:09.048 186245 INFO nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:28:09 compute-0 nova_compute[186241]: 2025-11-25 06:28:09.549 186245 DEBUG nova.policy [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:28:09 compute-0 nova_compute[186241]: 2025-11-25 06:28:09.552 186245 DEBUG nova.compute.manager [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:28:10 compute-0 podman[215830]: 2025-11-25 06:28:10.087913708 +0000 UTC m=+0.063905490 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.563 186245 DEBUG nova.compute.manager [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.564 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.565 186245 INFO nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Creating image(s)
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.565 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.566 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.566 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.567 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.569 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.571 186245 DEBUG oslo_concurrency.processutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:10.615 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.615 186245 DEBUG oslo_concurrency.processutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:10 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:10.615 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.616 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.616 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.617 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.621 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.621 186245 DEBUG oslo_concurrency.processutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.632 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.665 186245 DEBUG oslo_concurrency.processutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.666 186245 DEBUG oslo_concurrency.processutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.686 186245 DEBUG oslo_concurrency.processutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.687 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.688 186245 DEBUG oslo_concurrency.processutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.732 186245 DEBUG oslo_concurrency.processutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.733 186245 DEBUG nova.virt.disk.api [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.733 186245 DEBUG oslo_concurrency.processutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.777 186245 DEBUG oslo_concurrency.processutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.778 186245 DEBUG nova.virt.disk.api [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.779 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.779 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Ensure instance console log exists: /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.779 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.780 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:10 compute-0 nova_compute[186241]: 2025-11-25 06:28:10.780 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:11 compute-0 nova_compute[186241]: 2025-11-25 06:28:11.240 186245 DEBUG nova.network.neutron [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Successfully updated port: 3430a31e-7faf-4e40-951a-5767c915e85e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:28:11 compute-0 nova_compute[186241]: 2025-11-25 06:28:11.412 186245 DEBUG nova.compute.manager [req-f1562659-6b3b-4c70-b644-4902175dcc36 req-3fd51621-cd87-401c-b507-0b8bf71b35a8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Received event network-changed-3430a31e-7faf-4e40-951a-5767c915e85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:28:11 compute-0 nova_compute[186241]: 2025-11-25 06:28:11.413 186245 DEBUG nova.compute.manager [req-f1562659-6b3b-4c70-b644-4902175dcc36 req-3fd51621-cd87-401c-b507-0b8bf71b35a8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Refreshing instance network info cache due to event network-changed-3430a31e-7faf-4e40-951a-5767c915e85e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:28:11 compute-0 nova_compute[186241]: 2025-11-25 06:28:11.413 186245 DEBUG oslo_concurrency.lockutils [req-f1562659-6b3b-4c70-b644-4902175dcc36 req-3fd51621-cd87-401c-b507-0b8bf71b35a8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-9d4b7e67-a66f-4e43-9fac-512ecfb6735f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:28:11 compute-0 nova_compute[186241]: 2025-11-25 06:28:11.413 186245 DEBUG oslo_concurrency.lockutils [req-f1562659-6b3b-4c70-b644-4902175dcc36 req-3fd51621-cd87-401c-b507-0b8bf71b35a8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-9d4b7e67-a66f-4e43-9fac-512ecfb6735f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:28:11 compute-0 nova_compute[186241]: 2025-11-25 06:28:11.413 186245 DEBUG nova.network.neutron [req-f1562659-6b3b-4c70-b644-4902175dcc36 req-3fd51621-cd87-401c-b507-0b8bf71b35a8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Refreshing network info cache for port 3430a31e-7faf-4e40-951a-5767c915e85e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:28:11 compute-0 nova_compute[186241]: 2025-11-25 06:28:11.745 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-9d4b7e67-a66f-4e43-9fac-512ecfb6735f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:28:12 compute-0 nova_compute[186241]: 2025-11-25 06:28:12.029 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:12 compute-0 nova_compute[186241]: 2025-11-25 06:28:12.326 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:13 compute-0 nova_compute[186241]: 2025-11-25 06:28:13.248 186245 DEBUG nova.network.neutron [req-f1562659-6b3b-4c70-b644-4902175dcc36 req-3fd51621-cd87-401c-b507-0b8bf71b35a8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:28:14 compute-0 nova_compute[186241]: 2025-11-25 06:28:14.529 186245 DEBUG nova.network.neutron [req-f1562659-6b3b-4c70-b644-4902175dcc36 req-3fd51621-cd87-401c-b507-0b8bf71b35a8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:28:15 compute-0 nova_compute[186241]: 2025-11-25 06:28:15.033 186245 DEBUG oslo_concurrency.lockutils [req-f1562659-6b3b-4c70-b644-4902175dcc36 req-3fd51621-cd87-401c-b507-0b8bf71b35a8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-9d4b7e67-a66f-4e43-9fac-512ecfb6735f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:28:15 compute-0 nova_compute[186241]: 2025-11-25 06:28:15.033 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-9d4b7e67-a66f-4e43-9fac-512ecfb6735f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:28:15 compute-0 nova_compute[186241]: 2025-11-25 06:28:15.033 186245 DEBUG nova.network.neutron [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:28:15 compute-0 podman[215862]: 2025-11-25 06:28:15.066043877 +0000 UTC m=+0.042771939 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, 
com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible)
Nov 25 06:28:16 compute-0 nova_compute[186241]: 2025-11-25 06:28:16.244 186245 DEBUG nova.network.neutron [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:28:17 compute-0 nova_compute[186241]: 2025-11-25 06:28:17.031 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:17 compute-0 nova_compute[186241]: 2025-11-25 06:28:17.328 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.127 186245 DEBUG nova.network.neutron [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Updating instance_info_cache with network_info: [{"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.631 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-9d4b7e67-a66f-4e43-9fac-512ecfb6735f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.631 186245 DEBUG nova.compute.manager [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Instance network_info: |[{"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.633 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Start _get_guest_xml network_info=[{"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.636 186245 WARNING nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.637 186245 DEBUG nova.virt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-330519817', uuid='9d4b7e67-a66f-4e43-9fac-512ecfb6735f'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_ma
pping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764052098.6372223) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.643 186245 DEBUG nova.virt.libvirt.host [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.644 186245 DEBUG nova.virt.libvirt.host [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.646 186245 DEBUG nova.virt.libvirt.host [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.646 186245 DEBUG nova.virt.libvirt.host [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.646 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.647 186245 DEBUG nova.virt.hardware [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.647 186245 DEBUG nova.virt.hardware [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.647 186245 DEBUG nova.virt.hardware [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.647 186245 DEBUG nova.virt.hardware [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.648 186245 DEBUG nova.virt.hardware [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.648 186245 DEBUG nova.virt.hardware [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.648 186245 DEBUG nova.virt.hardware [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.648 186245 DEBUG nova.virt.hardware [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.648 186245 DEBUG nova.virt.hardware [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.648 186245 DEBUG nova.virt.hardware [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.649 186245 DEBUG nova.virt.hardware [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.651 186245 DEBUG nova.virt.libvirt.vif [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:28:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-330519817',display_name='tempest-TestNetworkBasicOps-server-330519817',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-330519817',id=8,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMSJey103Q0F2kp7X2sove4lipBsQ5vCuSrfn3Kx/yMSoOS9p+VfHjfGFVVtCd2mWHqAbICUsQ92fAP87X+wL+17Ciim1qS3aDJxN3Q4K6/UGI2BXPTdghfszIFZpVkZkg==',key_name='tempest-TestNetworkBasicOps-530301255',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-w0qs73q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:28:09Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=9d4b7e67-a66f-4e43-9fac-512ecfb6735f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.651 186245 DEBUG nova.network.os_vif_util [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.652 186245 DEBUG nova.network.os_vif_util [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:28:18 compute-0 nova_compute[186241]: 2025-11-25 06:28:18.653 186245 DEBUG nova.objects.instance [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid 9d4b7e67-a66f-4e43-9fac-512ecfb6735f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.157 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:28:19 compute-0 nova_compute[186241]:   <uuid>9d4b7e67-a66f-4e43-9fac-512ecfb6735f</uuid>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   <name>instance-00000008</name>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-330519817</nova:name>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:28:18</nova:creationTime>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:28:19 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:28:19 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:28:19 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:28:19 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:28:19 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:28:19 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:28:19 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:28:19 compute-0 nova_compute[186241]:         <nova:port uuid="3430a31e-7faf-4e40-951a-5767c915e85e">
Nov 25 06:28:19 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <system>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <entry name="serial">9d4b7e67-a66f-4e43-9fac-512ecfb6735f</entry>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <entry name="uuid">9d4b7e67-a66f-4e43-9fac-512ecfb6735f</entry>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     </system>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   <os>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   </os>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   <features>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   </features>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk.config"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:f1:54:ff"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <target dev="tap3430a31e-7f"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/console.log" append="off"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <video>
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     </video>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:28:19 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:28:19 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:28:19 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:28:19 compute-0 nova_compute[186241]: </domain>
Nov 25 06:28:19 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.158 186245 DEBUG nova.compute.manager [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Preparing to wait for external event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.158 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.158 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.158 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.159 186245 DEBUG nova.virt.libvirt.vif [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:28:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-330519817',display_name='tempest-TestNetworkBasicOps-server-330519817',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-330519817',id=8,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMSJey103Q0F2kp7X2sove4lipBsQ5vCuSrfn3Kx/yMSoOS9p+VfHjfGFVVtCd2mWHqAbICUsQ92fAP87X+wL+17Ciim1qS3aDJxN3Q4K6/UGI2BXPTdghfszIFZpVkZkg==',key_name='tempest-TestNetworkBasicOps-530301255',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-w0qs73q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:28:09Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=9d4b7e67-a66f-4e43-9fac-512ecfb6735f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.159 186245 DEBUG nova.network.os_vif_util [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.160 186245 DEBUG nova.network.os_vif_util [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.160 186245 DEBUG os_vif [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.160 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.161 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.161 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.162 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.162 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1190fdd6-a6cf-5d58-9d9b-b4ab8a93d073', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.167 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.169 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.169 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3430a31e-7f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.169 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3430a31e-7f, col_values=(('qos', UUID('e8375fe8-80cc-44a2-b1e3-c7217060e1ef')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.169 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3430a31e-7f, col_values=(('external_ids', {'iface-id': '3430a31e-7faf-4e40-951a-5767c915e85e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f1:54:ff', 'vm-uuid': '9d4b7e67-a66f-4e43-9fac-512ecfb6735f'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.170 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:19 compute-0 NetworkManager[55345]: <info>  [1764052099.1717] manager: (tap3430a31e-7f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.172 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.175 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:19 compute-0 nova_compute[186241]: 2025-11-25 06:28:19.175 186245 INFO os_vif [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f')
Nov 25 06:28:19 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:19.617 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:28:20 compute-0 nova_compute[186241]: 2025-11-25 06:28:20.702 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:28:20 compute-0 nova_compute[186241]: 2025-11-25 06:28:20.703 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:28:20 compute-0 nova_compute[186241]: 2025-11-25 06:28:20.703 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:f1:54:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:28:20 compute-0 nova_compute[186241]: 2025-11-25 06:28:20.704 186245 INFO nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Using config drive
Nov 25 06:28:21 compute-0 podman[215883]: 2025-11-25 06:28:21.086897992 +0000 UTC m=+0.065308026 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 25 06:28:21 compute-0 nova_compute[186241]: 2025-11-25 06:28:21.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:28:22 compute-0 nova_compute[186241]: 2025-11-25 06:28:22.033 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:22 compute-0 nova_compute[186241]: 2025-11-25 06:28:22.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:28:23 compute-0 nova_compute[186241]: 2025-11-25 06:28:23.239 186245 INFO nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Creating config drive at /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk.config
Nov 25 06:28:23 compute-0 nova_compute[186241]: 2025-11-25 06:28:23.244 186245 DEBUG oslo_concurrency.processutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpy68xjqw4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:23 compute-0 nova_compute[186241]: 2025-11-25 06:28:23.361 186245 DEBUG oslo_concurrency.processutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpy68xjqw4" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:23 compute-0 NetworkManager[55345]: <info>  [1764052103.3985] manager: (tap3430a31e-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Nov 25 06:28:23 compute-0 kernel: tap3430a31e-7f: entered promiscuous mode
Nov 25 06:28:23 compute-0 ovn_controller[95135]: 2025-11-25T06:28:23Z|00122|binding|INFO|Claiming lport 3430a31e-7faf-4e40-951a-5767c915e85e for this chassis.
Nov 25 06:28:23 compute-0 ovn_controller[95135]: 2025-11-25T06:28:23Z|00123|binding|INFO|3430a31e-7faf-4e40-951a-5767c915e85e: Claiming fa:16:3e:f1:54:ff 10.100.0.12
Nov 25 06:28:23 compute-0 nova_compute[186241]: 2025-11-25 06:28:23.400 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.418 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:54:ff 10.100.0.12'], port_security=['fa:16:3e:f1:54:ff 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1919168084', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9d4b7e67-a66f-4e43-9fac-512ecfb6735f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1919168084', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dafbde12-3514-4e2d-980f-9529576187d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ac93797-d190-4534-9cfc-8a64cabfa9fd, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=3430a31e-7faf-4e40-951a-5767c915e85e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.418 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 3430a31e-7faf-4e40-951a-5767c915e85e in datapath 1d238697-f844-4698-9f1c-19ed6cd73eb8 bound to our chassis
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.420 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1d238697-f844-4698-9f1c-19ed6cd73eb8
Nov 25 06:28:23 compute-0 systemd-udevd[215916]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:28:23 compute-0 NetworkManager[55345]: <info>  [1764052103.4349] device (tap3430a31e-7f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:28:23 compute-0 NetworkManager[55345]: <info>  [1764052103.4357] device (tap3430a31e-7f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.429 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[70d31a83-6221-4fa2-8f88-81a8c42d9690]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.435 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1d238697-f1 in ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.437 211354 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1d238697-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.437 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[95cccc2c-96a1-40e4-a3ac-08430d6b5a65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 systemd-machined[152921]: New machine qemu-8-instance-00000008.
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.437 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[acce9361-6080-42d3-98e0-7151341cba0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.448 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a52f4f-0572-4f50-867f-1ec6dcc6993c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Nov 25 06:28:23 compute-0 nova_compute[186241]: 2025-11-25 06:28:23.461 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.460 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[31fa4b97-a094-4387-9235-add6fc90599e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_controller[95135]: 2025-11-25T06:28:23Z|00124|binding|INFO|Setting lport 3430a31e-7faf-4e40-951a-5767c915e85e ovn-installed in OVS
Nov 25 06:28:23 compute-0 ovn_controller[95135]: 2025-11-25T06:28:23Z|00125|binding|INFO|Setting lport 3430a31e-7faf-4e40-951a-5767c915e85e up in Southbound
Nov 25 06:28:23 compute-0 nova_compute[186241]: 2025-11-25 06:28:23.468 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.482 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8d3e42-01d5-4d92-809c-95f2545681e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 NetworkManager[55345]: <info>  [1764052103.4862] manager: (tap1d238697-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/67)
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.487 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[bd09e7b4-cb0f-4551-9ed1-868280364df2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.511 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[67bfb231-f6d8-46fd-a35c-b9e40508bcb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.513 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[35b77ad5-474a-48ed-b623-6ef23a083415]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 NetworkManager[55345]: <info>  [1764052103.5295] device (tap1d238697-f0): carrier: link connected
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.533 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[76cd5837-fdec-43e1-996c-907dea56e0fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.545 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[d89cea5b-4239-439e-9fa0-0e2accea228c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d238697-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:8d:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 309193, 'reachable_time': 24829, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215943, 'error': None, 'target': 'ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.557 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a974c7-83eb-4615-8b5f-ebdff269ddf5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe17:8d19'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 309193, 'tstamp': 309193}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215944, 'error': None, 'target': 'ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.569 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[f4a843cc-1c01-40e4-86a1-2fc89470f16e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d238697-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:8d:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 309193, 'reachable_time': 24829, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215945, 'error': None, 'target': 'ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.590 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[847b4752-1175-445a-a4ef-856a4284b02b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.631 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5dbc2d63-2988-47f3-9f8c-f23880e0f4cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.632 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d238697-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.632 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.632 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d238697-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:28:23 compute-0 kernel: tap1d238697-f0: entered promiscuous mode
Nov 25 06:28:23 compute-0 NetworkManager[55345]: <info>  [1764052103.6343] manager: (tap1d238697-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Nov 25 06:28:23 compute-0 nova_compute[186241]: 2025-11-25 06:28:23.633 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.636 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1d238697-f0, col_values=(('external_ids', {'iface-id': '8b4b1775-6249-4ffb-bb90-6cf9cfca84ca'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:28:23 compute-0 ovn_controller[95135]: 2025-11-25T06:28:23Z|00126|binding|INFO|Releasing lport 8b4b1775-6249-4ffb-bb90-6cf9cfca84ca from this chassis (sb_readonly=0)
Nov 25 06:28:23 compute-0 nova_compute[186241]: 2025-11-25 06:28:23.649 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.650 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[47f63831-d1a6-4db9-9f43-cb78a9b29217]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.651 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.651 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.651 103953 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 1d238697-f844-4698-9f1c-19ed6cd73eb8 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.651 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.652 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[462ae240-6323-4b1c-9b96-e21d43ef2f24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.652 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.652 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[43839ac8-9b6d-414a-92b0-7921a7727e5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.653 103953 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: global
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     log         /dev/log local0 debug
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     log-tag     haproxy-metadata-proxy-1d238697-f844-4698-9f1c-19ed6cd73eb8
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     user        root
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     group       root
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     maxconn     1024
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     pidfile     /var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     daemon
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: defaults
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     log global
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     mode http
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     option httplog
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     option dontlognull
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     option http-server-close
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     option forwardfor
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     retries                 3
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     timeout http-request    30s
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     timeout connect         30s
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     timeout client          32s
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     timeout server          32s
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     timeout http-keep-alive 30s
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: listen listener
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     bind 169.254.169.254:80
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:     http-request add-header X-OVN-Network-ID 1d238697-f844-4698-9f1c-19ed6cd73eb8
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 06:28:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:23.653 103953 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'env', 'PROCESS_TAG=haproxy-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1d238697-f844-4698-9f1c-19ed6cd73eb8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Nov 25 06:28:23 compute-0 nova_compute[186241]: 2025-11-25 06:28:23.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:28:23 compute-0 podman[215979]: 2025-11-25 06:28:23.936056111 +0000 UTC m=+0.037437931 container create 50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 25 06:28:23 compute-0 systemd[1]: Started libpod-conmon-50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe.scope.
Nov 25 06:28:23 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:28:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/801d11e734e436409cb19af42a3dcce8a846e1ea31ef725002cc7f0205e31236/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:28:23 compute-0 podman[215979]: 2025-11-25 06:28:23.977136919 +0000 UTC m=+0.078518750 container init 50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.schema-version=1.0)
Nov 25 06:28:23 compute-0 podman[215979]: 2025-11-25 06:28:23.982155383 +0000 UTC m=+0.083537203 container start 50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 25 06:28:23 compute-0 podman[215979]: 2025-11-25 06:28:23.918707266 +0000 UTC m=+0.020089106 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:28:23 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[215992]: [NOTICE]   (215996) : New worker (215998) forked
Nov 25 06:28:23 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[215992]: [NOTICE]   (215996) : Loading success.
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.170 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.415 186245 DEBUG nova.compute.manager [req-a3630ccf-187d-474b-8971-904bf55f8c16 req-66aa848c-3753-4517-888f-091d39f7f217 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Received event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.415 186245 DEBUG oslo_concurrency.lockutils [req-a3630ccf-187d-474b-8971-904bf55f8c16 req-66aa848c-3753-4517-888f-091d39f7f217 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.416 186245 DEBUG oslo_concurrency.lockutils [req-a3630ccf-187d-474b-8971-904bf55f8c16 req-66aa848c-3753-4517-888f-091d39f7f217 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.416 186245 DEBUG oslo_concurrency.lockutils [req-a3630ccf-187d-474b-8971-904bf55f8c16 req-66aa848c-3753-4517-888f-091d39f7f217 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.416 186245 DEBUG nova.compute.manager [req-a3630ccf-187d-474b-8971-904bf55f8c16 req-66aa848c-3753-4517-888f-091d39f7f217 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Processing event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.417 186245 DEBUG nova.compute.manager [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.420 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.422 186245 INFO nova.virt.libvirt.driver [-] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Instance spawned successfully.
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.422 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.932 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.932 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.932 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.933 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.933 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:28:24 compute-0 nova_compute[186241]: 2025-11-25 06:28:24.933 186245 DEBUG nova.virt.libvirt.driver [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:28:25 compute-0 podman[216003]: 2025-11-25 06:28:25.064834162 +0000 UTC m=+0.045726780 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:28:25 compute-0 nova_compute[186241]: 2025-11-25 06:28:25.441 186245 INFO nova.compute.manager [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Took 14.88 seconds to spawn the instance on the hypervisor.
Nov 25 06:28:25 compute-0 nova_compute[186241]: 2025-11-25 06:28:25.442 186245 DEBUG nova.compute.manager [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:28:25 compute-0 nova_compute[186241]: 2025-11-25 06:28:25.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:28:25 compute-0 nova_compute[186241]: 2025-11-25 06:28:25.931 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:28:25 compute-0 nova_compute[186241]: 2025-11-25 06:28:25.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:28:25 compute-0 nova_compute[186241]: 2025-11-25 06:28:25.953 186245 INFO nova.compute.manager [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Took 20.00 seconds to build instance.
Nov 25 06:28:26 compute-0 nova_compute[186241]: 2025-11-25 06:28:26.441 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:26 compute-0 nova_compute[186241]: 2025-11-25 06:28:26.441 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:26 compute-0 nova_compute[186241]: 2025-11-25 06:28:26.441 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:26 compute-0 nova_compute[186241]: 2025-11-25 06:28:26.441 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:28:26 compute-0 nova_compute[186241]: 2025-11-25 06:28:26.455 186245 DEBUG oslo_concurrency.lockutils [None req-f63aaee0-29a4-4375-ae1b-2066a04dc7ad 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.512s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:26 compute-0 nova_compute[186241]: 2025-11-25 06:28:26.612 186245 DEBUG nova.compute.manager [req-d8ecc7b3-155e-4b94-adc4-a330439e325a req-62fcaf16-480e-4996-8032-ca2ab60baf1d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Received event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:28:26 compute-0 nova_compute[186241]: 2025-11-25 06:28:26.612 186245 DEBUG oslo_concurrency.lockutils [req-d8ecc7b3-155e-4b94-adc4-a330439e325a req-62fcaf16-480e-4996-8032-ca2ab60baf1d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:26 compute-0 nova_compute[186241]: 2025-11-25 06:28:26.613 186245 DEBUG oslo_concurrency.lockutils [req-d8ecc7b3-155e-4b94-adc4-a330439e325a req-62fcaf16-480e-4996-8032-ca2ab60baf1d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:26 compute-0 nova_compute[186241]: 2025-11-25 06:28:26.613 186245 DEBUG oslo_concurrency.lockutils [req-d8ecc7b3-155e-4b94-adc4-a330439e325a req-62fcaf16-480e-4996-8032-ca2ab60baf1d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:26 compute-0 nova_compute[186241]: 2025-11-25 06:28:26.613 186245 DEBUG nova.compute.manager [req-d8ecc7b3-155e-4b94-adc4-a330439e325a req-62fcaf16-480e-4996-8032-ca2ab60baf1d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] No waiting events found dispatching network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:28:26 compute-0 nova_compute[186241]: 2025-11-25 06:28:26.613 186245 WARNING nova.compute.manager [req-d8ecc7b3-155e-4b94-adc4-a330439e325a req-62fcaf16-480e-4996-8032-ca2ab60baf1d a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Received unexpected event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e for instance with vm_state active and task_state None.
Nov 25 06:28:27 compute-0 nova_compute[186241]: 2025-11-25 06:28:27.035 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:27 compute-0 nova_compute[186241]: 2025-11-25 06:28:27.468 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:27 compute-0 nova_compute[186241]: 2025-11-25 06:28:27.524 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:27 compute-0 nova_compute[186241]: 2025-11-25 06:28:27.525 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:27 compute-0 nova_compute[186241]: 2025-11-25 06:28:27.580 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:27 compute-0 nova_compute[186241]: 2025-11-25 06:28:27.786 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:28:27 compute-0 nova_compute[186241]: 2025-11-25 06:28:27.787 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5610MB free_disk=73.01715850830078GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:28:27 compute-0 nova_compute[186241]: 2025-11-25 06:28:27.787 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:27 compute-0 nova_compute[186241]: 2025-11-25 06:28:27.787 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:28 compute-0 nova_compute[186241]: 2025-11-25 06:28:28.826 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance 9d4b7e67-a66f-4e43-9fac-512ecfb6735f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:28:28 compute-0 nova_compute[186241]: 2025-11-25 06:28:28.826 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:28:28 compute-0 nova_compute[186241]: 2025-11-25 06:28:28.827 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:28:28 compute-0 nova_compute[186241]: 2025-11-25 06:28:28.865 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:28:29 compute-0 nova_compute[186241]: 2025-11-25 06:28:29.172 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:29 compute-0 nova_compute[186241]: 2025-11-25 06:28:29.370 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:28:29 compute-0 nova_compute[186241]: 2025-11-25 06:28:29.876 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:28:29 compute-0 nova_compute[186241]: 2025-11-25 06:28:29.876 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:30 compute-0 nova_compute[186241]: 2025-11-25 06:28:30.876 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:28:30 compute-0 nova_compute[186241]: 2025-11-25 06:28:30.877 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:28:30 compute-0 nova_compute[186241]: 2025-11-25 06:28:30.877 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:28:31 compute-0 ovn_controller[95135]: 2025-11-25T06:28:31Z|00127|binding|INFO|Releasing lport 8b4b1775-6249-4ffb-bb90-6cf9cfca84ca from this chassis (sb_readonly=0)
Nov 25 06:28:31 compute-0 NetworkManager[55345]: <info>  [1764052111.9588] manager: (patch-br-int-to-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Nov 25 06:28:31 compute-0 NetworkManager[55345]: <info>  [1764052111.9595] manager: (patch-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Nov 25 06:28:31 compute-0 nova_compute[186241]: 2025-11-25 06:28:31.959 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:31 compute-0 ovn_controller[95135]: 2025-11-25T06:28:31Z|00128|binding|INFO|Releasing lport 8b4b1775-6249-4ffb-bb90-6cf9cfca84ca from this chassis (sb_readonly=0)
Nov 25 06:28:31 compute-0 nova_compute[186241]: 2025-11-25 06:28:31.991 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:31 compute-0 nova_compute[186241]: 2025-11-25 06:28:31.994 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:32 compute-0 nova_compute[186241]: 2025-11-25 06:28:32.036 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:32 compute-0 nova_compute[186241]: 2025-11-25 06:28:32.637 186245 DEBUG nova.compute.manager [req-7832e88d-7eb6-430c-ab1e-55c5f68bcbd4 req-3b225d03-c0ef-4dde-90dc-eb91ee01cf8b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Received event network-changed-3430a31e-7faf-4e40-951a-5767c915e85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:28:32 compute-0 nova_compute[186241]: 2025-11-25 06:28:32.637 186245 DEBUG nova.compute.manager [req-7832e88d-7eb6-430c-ab1e-55c5f68bcbd4 req-3b225d03-c0ef-4dde-90dc-eb91ee01cf8b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Refreshing instance network info cache due to event network-changed-3430a31e-7faf-4e40-951a-5767c915e85e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:28:32 compute-0 nova_compute[186241]: 2025-11-25 06:28:32.638 186245 DEBUG oslo_concurrency.lockutils [req-7832e88d-7eb6-430c-ab1e-55c5f68bcbd4 req-3b225d03-c0ef-4dde-90dc-eb91ee01cf8b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-9d4b7e67-a66f-4e43-9fac-512ecfb6735f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:28:32 compute-0 nova_compute[186241]: 2025-11-25 06:28:32.638 186245 DEBUG oslo_concurrency.lockutils [req-7832e88d-7eb6-430c-ab1e-55c5f68bcbd4 req-3b225d03-c0ef-4dde-90dc-eb91ee01cf8b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-9d4b7e67-a66f-4e43-9fac-512ecfb6735f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:28:32 compute-0 nova_compute[186241]: 2025-11-25 06:28:32.638 186245 DEBUG nova.network.neutron [req-7832e88d-7eb6-430c-ab1e-55c5f68bcbd4 req-3b225d03-c0ef-4dde-90dc-eb91ee01cf8b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Refreshing network info cache for port 3430a31e-7faf-4e40-951a-5767c915e85e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:28:33 compute-0 podman[216033]: 2025-11-25 06:28:33.084046425 +0000 UTC m=+0.063746332 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:28:33 compute-0 nova_compute[186241]: 2025-11-25 06:28:33.335 186245 DEBUG oslo_concurrency.lockutils [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:33 compute-0 nova_compute[186241]: 2025-11-25 06:28:33.337 186245 DEBUG oslo_concurrency.lockutils [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:33 compute-0 nova_compute[186241]: 2025-11-25 06:28:33.337 186245 DEBUG oslo_concurrency.lockutils [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:33 compute-0 nova_compute[186241]: 2025-11-25 06:28:33.338 186245 DEBUG oslo_concurrency.lockutils [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:33 compute-0 nova_compute[186241]: 2025-11-25 06:28:33.338 186245 DEBUG oslo_concurrency.lockutils [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:33 compute-0 nova_compute[186241]: 2025-11-25 06:28:33.339 186245 INFO nova.compute.manager [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Terminating instance
Nov 25 06:28:33 compute-0 nova_compute[186241]: 2025-11-25 06:28:33.842 186245 DEBUG nova.compute.manager [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:28:33 compute-0 kernel: tap3430a31e-7f (unregistering): left promiscuous mode
Nov 25 06:28:33 compute-0 NetworkManager[55345]: <info>  [1764052113.8644] device (tap3430a31e-7f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:28:33 compute-0 ovn_controller[95135]: 2025-11-25T06:28:33Z|00129|binding|INFO|Releasing lport 3430a31e-7faf-4e40-951a-5767c915e85e from this chassis (sb_readonly=0)
Nov 25 06:28:33 compute-0 ovn_controller[95135]: 2025-11-25T06:28:33Z|00130|binding|INFO|Setting lport 3430a31e-7faf-4e40-951a-5767c915e85e down in Southbound
Nov 25 06:28:33 compute-0 ovn_controller[95135]: 2025-11-25T06:28:33Z|00131|binding|INFO|Removing iface tap3430a31e-7f ovn-installed in OVS
Nov 25 06:28:33 compute-0 nova_compute[186241]: 2025-11-25 06:28:33.869 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:33 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:33.872 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:54:ff 10.100.0.12'], port_security=['fa:16:3e:f1:54:ff 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1919168084', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9d4b7e67-a66f-4e43-9fac-512ecfb6735f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1919168084', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'dafbde12-3514-4e2d-980f-9529576187d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.238'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ac93797-d190-4534-9cfc-8a64cabfa9fd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=3430a31e-7faf-4e40-951a-5767c915e85e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:28:33 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:33.873 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 3430a31e-7faf-4e40-951a-5767c915e85e in datapath 1d238697-f844-4698-9f1c-19ed6cd73eb8 unbound from our chassis
Nov 25 06:28:33 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:33.874 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d238697-f844-4698-9f1c-19ed6cd73eb8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:28:33 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:33.879 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[1c9b3a0e-482c-4aff-af9d-d4b4c58ab977]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:33 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:33.880 103953 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8 namespace which is not needed anymore
Nov 25 06:28:33 compute-0 nova_compute[186241]: 2025-11-25 06:28:33.887 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:33 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 25 06:28:33 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 9.620s CPU time.
Nov 25 06:28:33 compute-0 systemd-machined[152921]: Machine qemu-8-instance-00000008 terminated.
Nov 25 06:28:33 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[215992]: [NOTICE]   (215996) : haproxy version is 2.8.14-c23fe91
Nov 25 06:28:33 compute-0 podman[216081]: 2025-11-25 06:28:33.961758346 +0000 UTC m=+0.019789499 container kill 50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:28:33 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[215992]: [NOTICE]   (215996) : path to executable is /usr/sbin/haproxy
Nov 25 06:28:33 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[215992]: [WARNING]  (215996) : Exiting Master process...
Nov 25 06:28:33 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[215992]: [ALERT]    (215996) : Current worker (215998) exited with code 143 (Terminated)
Nov 25 06:28:33 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[215992]: [WARNING]  (215996) : All workers exited. Exiting... (0)
Nov 25 06:28:33 compute-0 systemd[1]: libpod-50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe.scope: Deactivated successfully.
Nov 25 06:28:33 compute-0 podman[216093]: 2025-11-25 06:28:33.991911217 +0000 UTC m=+0.015338977 container died 50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:28:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe-userdata-shm.mount: Deactivated successfully.
Nov 25 06:28:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-801d11e734e436409cb19af42a3dcce8a846e1ea31ef725002cc7f0205e31236-merged.mount: Deactivated successfully.
Nov 25 06:28:34 compute-0 podman[216093]: 2025-11-25 06:28:34.011449933 +0000 UTC m=+0.034877683 container cleanup 50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:28:34 compute-0 systemd[1]: libpod-conmon-50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe.scope: Deactivated successfully.
Nov 25 06:28:34 compute-0 podman[216094]: 2025-11-25 06:28:34.01886228 +0000 UTC m=+0.041216586 container remove 50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:28:34 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:34.022 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[10fa4509-0dcd-4bce-8ad8-2ad6b14c6925]: (4, ("Tue Nov 25 06:28:33 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8 (50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe)\n50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe\nTue Nov 25 06:28:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8 (50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe)\n50df387a6ecaeb8c05ec3d9006e9236acc5219313dcb2e7949cf627bc411f5fe\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:34 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:34.023 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[69d317e0-cdba-4faa-8f8d-c98c86184ab0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:34 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:34.023 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:28:34 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:34.023 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[755de36e-b12f-4a04-87d5-a8c1ee8d40b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:34 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:34.024 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d238697-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.027 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:34 compute-0 kernel: tap1d238697-f0: left promiscuous mode
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.039 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.041 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:34 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:34.043 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8338d819-d2cc-471b-8b3d-3c827ddaea8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:34 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:34.052 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[e3cbe345-69a2-4e7a-a5c1-667f4f1b5599]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:34 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:34.053 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5ecefa89-2897-4a7f-ab80-78eb60d80dee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:34 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:34.064 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[719dc0b6-e707-4e8f-a620-24df62f5e853]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 309188, 'reachable_time': 36262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216126, 'error': None, 'target': 'ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d1d238697\x2df844\x2d4698\x2d9f1c\x2d19ed6cd73eb8.mount: Deactivated successfully.
Nov 25 06:28:34 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:34.067 104066 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 06:28:34 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:34.067 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[0e06d5c7-1d49-42d8-a3bc-085a3f54714f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.077 186245 INFO nova.virt.libvirt.driver [-] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Instance destroyed successfully.
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.078 186245 DEBUG nova.objects.instance [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid 9d4b7e67-a66f-4e43-9fac-512ecfb6735f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.173 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.581 186245 DEBUG nova.virt.libvirt.vif [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:28:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-330519817',display_name='tempest-TestNetworkBasicOps-server-330519817',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-330519817',id=8,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMSJey103Q0F2kp7X2sove4lipBsQ5vCuSrfn3Kx/yMSoOS9p+VfHjfGFVVtCd2mWHqAbICUsQ92fAP87X+wL+17Ciim1qS3aDJxN3Q4K6/UGI2BXPTdghfszIFZpVkZkg==',key_name='tempest-TestNetworkBasicOps-530301255',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:28:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-w0qs73q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:28:25Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=9d4b7e67-a66f-4e43-9fac-512ecfb6735f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.582 186245 DEBUG nova.network.os_vif_util [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.582 186245 DEBUG nova.network.os_vif_util [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.582 186245 DEBUG os_vif [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.584 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.584 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3430a31e-7f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.586 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.587 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.587 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.587 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e8375fe8-80cc-44a2-b1e3-c7217060e1ef) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.588 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.589 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.590 186245 INFO os_vif [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f')
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.591 186245 INFO nova.virt.libvirt.driver [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Deleting instance files /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f_del
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.591 186245 INFO nova.virt.libvirt.driver [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Deletion of /var/lib/nova/instances/9d4b7e67-a66f-4e43-9fac-512ecfb6735f_del complete
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.782 186245 DEBUG nova.compute.manager [req-96f31bd4-0407-4585-8471-6a8239dfef1a req-25804b2b-f269-4332-9cf2-7e46aed26fa9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Received event network-vif-unplugged-3430a31e-7faf-4e40-951a-5767c915e85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.782 186245 DEBUG oslo_concurrency.lockutils [req-96f31bd4-0407-4585-8471-6a8239dfef1a req-25804b2b-f269-4332-9cf2-7e46aed26fa9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.782 186245 DEBUG oslo_concurrency.lockutils [req-96f31bd4-0407-4585-8471-6a8239dfef1a req-25804b2b-f269-4332-9cf2-7e46aed26fa9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.782 186245 DEBUG oslo_concurrency.lockutils [req-96f31bd4-0407-4585-8471-6a8239dfef1a req-25804b2b-f269-4332-9cf2-7e46aed26fa9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.782 186245 DEBUG nova.compute.manager [req-96f31bd4-0407-4585-8471-6a8239dfef1a req-25804b2b-f269-4332-9cf2-7e46aed26fa9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] No waiting events found dispatching network-vif-unplugged-3430a31e-7faf-4e40-951a-5767c915e85e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.783 186245 DEBUG nova.compute.manager [req-96f31bd4-0407-4585-8471-6a8239dfef1a req-25804b2b-f269-4332-9cf2-7e46aed26fa9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Received event network-vif-unplugged-3430a31e-7faf-4e40-951a-5767c915e85e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.783 186245 DEBUG nova.compute.manager [req-96f31bd4-0407-4585-8471-6a8239dfef1a req-25804b2b-f269-4332-9cf2-7e46aed26fa9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Received event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.783 186245 DEBUG oslo_concurrency.lockutils [req-96f31bd4-0407-4585-8471-6a8239dfef1a req-25804b2b-f269-4332-9cf2-7e46aed26fa9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.783 186245 DEBUG oslo_concurrency.lockutils [req-96f31bd4-0407-4585-8471-6a8239dfef1a req-25804b2b-f269-4332-9cf2-7e46aed26fa9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.783 186245 DEBUG oslo_concurrency.lockutils [req-96f31bd4-0407-4585-8471-6a8239dfef1a req-25804b2b-f269-4332-9cf2-7e46aed26fa9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.783 186245 DEBUG nova.compute.manager [req-96f31bd4-0407-4585-8471-6a8239dfef1a req-25804b2b-f269-4332-9cf2-7e46aed26fa9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] No waiting events found dispatching network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:28:34 compute-0 nova_compute[186241]: 2025-11-25 06:28:34.783 186245 WARNING nova.compute.manager [req-96f31bd4-0407-4585-8471-6a8239dfef1a req-25804b2b-f269-4332-9cf2-7e46aed26fa9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Received unexpected event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e for instance with vm_state active and task_state deleting.
Nov 25 06:28:35 compute-0 nova_compute[186241]: 2025-11-25 06:28:35.098 186245 INFO nova.compute.manager [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Took 1.25 seconds to destroy the instance on the hypervisor.
Nov 25 06:28:35 compute-0 nova_compute[186241]: 2025-11-25 06:28:35.098 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:28:35 compute-0 nova_compute[186241]: 2025-11-25 06:28:35.098 186245 DEBUG nova.compute.manager [-] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:28:35 compute-0 nova_compute[186241]: 2025-11-25 06:28:35.099 186245 DEBUG nova.network.neutron [-] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:28:37 compute-0 nova_compute[186241]: 2025-11-25 06:28:37.037 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:37 compute-0 podman[216140]: 2025-11-25 06:28:37.068518469 +0000 UTC m=+0.041810164 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:28:37 compute-0 podman[216141]: 2025-11-25 06:28:37.068967256 +0000 UTC m=+0.040809999 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:28:37 compute-0 nova_compute[186241]: 2025-11-25 06:28:37.290 186245 DEBUG nova.network.neutron [req-7832e88d-7eb6-430c-ab1e-55c5f68bcbd4 req-3b225d03-c0ef-4dde-90dc-eb91ee01cf8b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Updated VIF entry in instance network info cache for port 3430a31e-7faf-4e40-951a-5767c915e85e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:28:37 compute-0 nova_compute[186241]: 2025-11-25 06:28:37.291 186245 DEBUG nova.network.neutron [req-7832e88d-7eb6-430c-ab1e-55c5f68bcbd4 req-3b225d03-c0ef-4dde-90dc-eb91ee01cf8b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Updating instance_info_cache with network_info: [{"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:28:37 compute-0 nova_compute[186241]: 2025-11-25 06:28:37.794 186245 DEBUG oslo_concurrency.lockutils [req-7832e88d-7eb6-430c-ab1e-55c5f68bcbd4 req-3b225d03-c0ef-4dde-90dc-eb91ee01cf8b a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-9d4b7e67-a66f-4e43-9fac-512ecfb6735f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:28:38 compute-0 nova_compute[186241]: 2025-11-25 06:28:38.013 186245 DEBUG nova.network.neutron [-] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:28:38 compute-0 nova_compute[186241]: 2025-11-25 06:28:38.515 186245 INFO nova.compute.manager [-] [instance: 9d4b7e67-a66f-4e43-9fac-512ecfb6735f] Took 3.42 seconds to deallocate network for instance.
Nov 25 06:28:39 compute-0 nova_compute[186241]: 2025-11-25 06:28:39.021 186245 DEBUG oslo_concurrency.lockutils [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:39 compute-0 nova_compute[186241]: 2025-11-25 06:28:39.022 186245 DEBUG oslo_concurrency.lockutils [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:39 compute-0 nova_compute[186241]: 2025-11-25 06:28:39.068 186245 DEBUG nova.compute.provider_tree [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:28:39 compute-0 nova_compute[186241]: 2025-11-25 06:28:39.572 186245 DEBUG nova.scheduler.client.report [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:28:39 compute-0 nova_compute[186241]: 2025-11-25 06:28:39.588 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:40 compute-0 nova_compute[186241]: 2025-11-25 06:28:40.077 186245 DEBUG oslo_concurrency.lockutils [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:40 compute-0 nova_compute[186241]: 2025-11-25 06:28:40.097 186245 INFO nova.scheduler.client.report [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance 9d4b7e67-a66f-4e43-9fac-512ecfb6735f
Nov 25 06:28:41 compute-0 podman[216178]: 2025-11-25 06:28:41.052496102 +0000 UTC m=+0.032496021 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 25 06:28:41 compute-0 nova_compute[186241]: 2025-11-25 06:28:41.106 186245 DEBUG oslo_concurrency.lockutils [None req-96002bb2-e644-470a-a26b-58eb1094988a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "9d4b7e67-a66f-4e43-9fac-512ecfb6735f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:42 compute-0 nova_compute[186241]: 2025-11-25 06:28:42.040 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:44 compute-0 nova_compute[186241]: 2025-11-25 06:28:44.589 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:46 compute-0 podman[216194]: 2025-11-25 06:28:46.084197535 +0000 UTC m=+0.062577256 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Nov 25 06:28:47 compute-0 nova_compute[186241]: 2025-11-25 06:28:47.041 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:47.682 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:47.682 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:28:47.682 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:49 compute-0 nova_compute[186241]: 2025-11-25 06:28:49.590 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:51 compute-0 nova_compute[186241]: 2025-11-25 06:28:51.803 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "00aa090c-560a-41d0-81f1-858b407a81e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:51 compute-0 nova_compute[186241]: 2025-11-25 06:28:51.804 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:52 compute-0 nova_compute[186241]: 2025-11-25 06:28:52.042 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:52 compute-0 podman[216213]: 2025-11-25 06:28:52.066978964 +0000 UTC m=+0.045506836 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 25 06:28:52 compute-0 nova_compute[186241]: 2025-11-25 06:28:52.307 186245 DEBUG nova.compute.manager [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:28:52 compute-0 nova_compute[186241]: 2025-11-25 06:28:52.838 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:52 compute-0 nova_compute[186241]: 2025-11-25 06:28:52.838 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:52 compute-0 nova_compute[186241]: 2025-11-25 06:28:52.844 186245 DEBUG nova.virt.hardware [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:28:52 compute-0 nova_compute[186241]: 2025-11-25 06:28:52.845 186245 INFO nova.compute.claims [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:28:53 compute-0 nova_compute[186241]: 2025-11-25 06:28:53.882 186245 DEBUG nova.compute.provider_tree [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:28:54 compute-0 nova_compute[186241]: 2025-11-25 06:28:54.385 186245 DEBUG nova.scheduler.client.report [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:28:54 compute-0 nova_compute[186241]: 2025-11-25 06:28:54.592 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:54 compute-0 nova_compute[186241]: 2025-11-25 06:28:54.891 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:54 compute-0 nova_compute[186241]: 2025-11-25 06:28:54.891 186245 DEBUG nova.compute.manager [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:28:55 compute-0 nova_compute[186241]: 2025-11-25 06:28:55.397 186245 DEBUG nova.compute.manager [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:28:55 compute-0 nova_compute[186241]: 2025-11-25 06:28:55.397 186245 DEBUG nova.network.neutron [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:28:55 compute-0 nova_compute[186241]: 2025-11-25 06:28:55.902 186245 INFO nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:28:56 compute-0 podman[216230]: 2025-11-25 06:28:56.08522535 +0000 UTC m=+0.064555377 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 25 06:28:56 compute-0 nova_compute[186241]: 2025-11-25 06:28:56.406 186245 DEBUG nova.compute.manager [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:28:56 compute-0 nova_compute[186241]: 2025-11-25 06:28:56.666 186245 DEBUG nova.policy [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.044 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.416 186245 DEBUG nova.compute.manager [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.417 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.417 186245 INFO nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Creating image(s)
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.417 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.418 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.418 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.419 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.422 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.422 186245 DEBUG oslo_concurrency.processutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.467 186245 DEBUG oslo_concurrency.processutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.468 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.468 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.469 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.472 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.473 186245 DEBUG oslo_concurrency.processutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.516 186245 DEBUG oslo_concurrency.processutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.517 186245 DEBUG oslo_concurrency.processutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.537 186245 DEBUG oslo_concurrency.processutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.537 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.538 186245 DEBUG oslo_concurrency.processutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.581 186245 DEBUG oslo_concurrency.processutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.582 186245 DEBUG nova.virt.disk.api [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.582 186245 DEBUG oslo_concurrency.processutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.624 186245 DEBUG oslo_concurrency.processutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.625 186245 DEBUG nova.virt.disk.api [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.625 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.626 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Ensure instance console log exists: /var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.626 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.626 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.627 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:28:57 compute-0 nova_compute[186241]: 2025-11-25 06:28:57.981 186245 DEBUG nova.network.neutron [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Successfully updated port: 3430a31e-7faf-4e40-951a-5767c915e85e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:28:58 compute-0 nova_compute[186241]: 2025-11-25 06:28:58.142 186245 DEBUG nova.compute.manager [req-d151662e-e267-466b-9437-c8925f882471 req-56be0a5b-a749-4a4e-baf0-aea14632a6f3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Received event network-changed-3430a31e-7faf-4e40-951a-5767c915e85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:28:58 compute-0 nova_compute[186241]: 2025-11-25 06:28:58.142 186245 DEBUG nova.compute.manager [req-d151662e-e267-466b-9437-c8925f882471 req-56be0a5b-a749-4a4e-baf0-aea14632a6f3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Refreshing instance network info cache due to event network-changed-3430a31e-7faf-4e40-951a-5767c915e85e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:28:58 compute-0 nova_compute[186241]: 2025-11-25 06:28:58.142 186245 DEBUG oslo_concurrency.lockutils [req-d151662e-e267-466b-9437-c8925f882471 req-56be0a5b-a749-4a4e-baf0-aea14632a6f3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-00aa090c-560a-41d0-81f1-858b407a81e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:28:58 compute-0 nova_compute[186241]: 2025-11-25 06:28:58.142 186245 DEBUG oslo_concurrency.lockutils [req-d151662e-e267-466b-9437-c8925f882471 req-56be0a5b-a749-4a4e-baf0-aea14632a6f3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-00aa090c-560a-41d0-81f1-858b407a81e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:28:58 compute-0 nova_compute[186241]: 2025-11-25 06:28:58.142 186245 DEBUG nova.network.neutron [req-d151662e-e267-466b-9437-c8925f882471 req-56be0a5b-a749-4a4e-baf0-aea14632a6f3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Refreshing network info cache for port 3430a31e-7faf-4e40-951a-5767c915e85e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:28:58 compute-0 nova_compute[186241]: 2025-11-25 06:28:58.485 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-00aa090c-560a-41d0-81f1-858b407a81e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:28:58 compute-0 nova_compute[186241]: 2025-11-25 06:28:58.947 186245 DEBUG nova.network.neutron [req-d151662e-e267-466b-9437-c8925f882471 req-56be0a5b-a749-4a4e-baf0-aea14632a6f3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:28:59 compute-0 nova_compute[186241]: 2025-11-25 06:28:59.594 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:01 compute-0 nova_compute[186241]: 2025-11-25 06:29:01.271 186245 DEBUG nova.network.neutron [req-d151662e-e267-466b-9437-c8925f882471 req-56be0a5b-a749-4a4e-baf0-aea14632a6f3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:29:01 compute-0 nova_compute[186241]: 2025-11-25 06:29:01.775 186245 DEBUG oslo_concurrency.lockutils [req-d151662e-e267-466b-9437-c8925f882471 req-56be0a5b-a749-4a4e-baf0-aea14632a6f3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-00aa090c-560a-41d0-81f1-858b407a81e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:29:01 compute-0 nova_compute[186241]: 2025-11-25 06:29:01.776 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-00aa090c-560a-41d0-81f1-858b407a81e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:29:01 compute-0 nova_compute[186241]: 2025-11-25 06:29:01.776 186245 DEBUG nova.network.neutron [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:29:02 compute-0 nova_compute[186241]: 2025-11-25 06:29:02.046 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:03 compute-0 nova_compute[186241]: 2025-11-25 06:29:03.277 186245 DEBUG nova.network.neutron [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:29:04 compute-0 podman[216266]: 2025-11-25 06:29:04.073649926 +0000 UTC m=+0.053164727 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:29:04 compute-0 nova_compute[186241]: 2025-11-25 06:29:04.596 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.377 186245 DEBUG nova.network.neutron [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Updating instance_info_cache with network_info: [{"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.881 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-00aa090c-560a-41d0-81f1-858b407a81e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.881 186245 DEBUG nova.compute.manager [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Instance network_info: |[{"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.883 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Start _get_guest_xml network_info=[{"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.886 186245 WARNING nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.886 186245 DEBUG nova.virt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1638078176', uuid='00aa090c-560a-41d0-81f1-858b407a81e1'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764052145.8867185) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.894 186245 DEBUG nova.virt.libvirt.host [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.895 186245 DEBUG nova.virt.libvirt.host [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.897 186245 DEBUG nova.virt.libvirt.host [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.897 186245 DEBUG nova.virt.libvirt.host [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.898 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.898 186245 DEBUG nova.virt.hardware [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.898 186245 DEBUG nova.virt.hardware [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.898 186245 DEBUG nova.virt.hardware [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.899 186245 DEBUG nova.virt.hardware [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.899 186245 DEBUG nova.virt.hardware [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.899 186245 DEBUG nova.virt.hardware [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.899 186245 DEBUG nova.virt.hardware [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.899 186245 DEBUG nova.virt.hardware [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.900 186245 DEBUG nova.virt.hardware [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.900 186245 DEBUG nova.virt.hardware [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.900 186245 DEBUG nova.virt.hardware [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.902 186245 DEBUG nova.virt.libvirt.vif [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1638078176',display_name='tempest-TestNetworkBasicOps-server-1638078176',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1638078176',id=9,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/XLF7yR40raD9jQ7djYGBtxFFBzniEKfXdM01FU98ONCx+gzIy0tQcpBYK/0yHgRUDrChGFbxsjyfM/YKxQUo6ZSly2BctH3ubafa30QOf8S7wYwGVVWP0IGIGHlKA7Q==',key_name='tempest-TestNetworkBasicOps-752136947',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-aek07u77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:28:56Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=00aa090c-560a-41d0-81f1-858b407a81e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.903 186245 DEBUG nova.network.os_vif_util [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.903 186245 DEBUG nova.network.os_vif_util [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:29:05 compute-0 nova_compute[186241]: 2025-11-25 06:29:05.904 186245 DEBUG nova.objects.instance [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid 00aa090c-560a-41d0-81f1-858b407a81e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.408 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:29:06 compute-0 nova_compute[186241]:   <uuid>00aa090c-560a-41d0-81f1-858b407a81e1</uuid>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   <name>instance-00000009</name>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-1638078176</nova:name>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:29:05</nova:creationTime>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:29:06 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:29:06 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:29:06 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:29:06 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:29:06 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:29:06 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:29:06 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:29:06 compute-0 nova_compute[186241]:         <nova:port uuid="3430a31e-7faf-4e40-951a-5767c915e85e">
Nov 25 06:29:06 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <system>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <entry name="serial">00aa090c-560a-41d0-81f1-858b407a81e1</entry>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <entry name="uuid">00aa090c-560a-41d0-81f1-858b407a81e1</entry>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     </system>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   <os>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   </os>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   <features>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   </features>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk.config"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:f1:54:ff"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <target dev="tap3430a31e-7f"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/console.log" append="off"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <video>
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     </video>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:29:06 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:29:06 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:29:06 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:29:06 compute-0 nova_compute[186241]: </domain>
Nov 25 06:29:06 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.409 186245 DEBUG nova.compute.manager [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Preparing to wait for external event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.409 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.410 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.410 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.410 186245 DEBUG nova.virt.libvirt.vif [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1638078176',display_name='tempest-TestNetworkBasicOps-server-1638078176',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1638078176',id=9,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/XLF7yR40raD9jQ7djYGBtxFFBzniEKfXdM01FU98ONCx+gzIy0tQcpBYK/0yHgRUDrChGFbxsjyfM/YKxQUo6ZSly2BctH3ubafa30QOf8S7wYwGVVWP0IGIGHlKA7Q==',key_name='tempest-TestNetworkBasicOps-752136947',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-aek07u77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:28:56Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=00aa090c-560a-41d0-81f1-858b407a81e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.411 186245 DEBUG nova.network.os_vif_util [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.411 186245 DEBUG nova.network.os_vif_util [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.411 186245 DEBUG os_vif [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.412 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.412 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.412 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.413 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.413 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1190fdd6-a6cf-5d58-9d9b-b4ab8a93d073', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.414 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.415 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.416 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.416 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3430a31e-7f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.417 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3430a31e-7f, col_values=(('qos', UUID('640ae5a3-1556-4b9d-9d31-7b6e09edd981')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.417 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3430a31e-7f, col_values=(('external_ids', {'iface-id': '3430a31e-7faf-4e40-951a-5767c915e85e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f1:54:ff', 'vm-uuid': '00aa090c-560a-41d0-81f1-858b407a81e1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:29:06 compute-0 NetworkManager[55345]: <info>  [1764052146.4188] manager: (tap3430a31e-7f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.418 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.420 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.422 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:06 compute-0 nova_compute[186241]: 2025-11-25 06:29:06.422 186245 INFO os_vif [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f')
Nov 25 06:29:07 compute-0 nova_compute[186241]: 2025-11-25 06:29:07.048 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:07 compute-0 nova_compute[186241]: 2025-11-25 06:29:07.946 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:29:07 compute-0 nova_compute[186241]: 2025-11-25 06:29:07.947 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:29:07 compute-0 nova_compute[186241]: 2025-11-25 06:29:07.947 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:f1:54:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:29:07 compute-0 nova_compute[186241]: 2025-11-25 06:29:07.948 186245 INFO nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Using config drive
Nov 25 06:29:08 compute-0 podman[216291]: 2025-11-25 06:29:08.064965886 +0000 UTC m=+0.043438305 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:29:08 compute-0 podman[216292]: 2025-11-25 06:29:08.06496769 +0000 UTC m=+0.042430955 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.280 186245 INFO nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Creating config drive at /var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk.config
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.285 186245 DEBUG oslo_concurrency.processutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpyfm1tblt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.403 186245 DEBUG oslo_concurrency.processutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpyfm1tblt" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:29:09 compute-0 kernel: tap3430a31e-7f: entered promiscuous mode
Nov 25 06:29:09 compute-0 NetworkManager[55345]: <info>  [1764052149.4422] manager: (tap3430a31e-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Nov 25 06:29:09 compute-0 ovn_controller[95135]: 2025-11-25T06:29:09Z|00132|binding|INFO|Claiming lport 3430a31e-7faf-4e40-951a-5767c915e85e for this chassis.
Nov 25 06:29:09 compute-0 ovn_controller[95135]: 2025-11-25T06:29:09Z|00133|binding|INFO|3430a31e-7faf-4e40-951a-5767c915e85e: Claiming fa:16:3e:f1:54:ff 10.100.0.12
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.444 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.451 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:54:ff 10.100.0.12'], port_security=['fa:16:3e:f1:54:ff 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1919168084', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '00aa090c-560a-41d0-81f1-858b407a81e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1919168084', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'dafbde12-3514-4e2d-980f-9529576187d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.238'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ac93797-d190-4534-9cfc-8a64cabfa9fd, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=3430a31e-7faf-4e40-951a-5767c915e85e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.452 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 3430a31e-7faf-4e40-951a-5767c915e85e in datapath 1d238697-f844-4698-9f1c-19ed6cd73eb8 bound to our chassis
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.453 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1d238697-f844-4698-9f1c-19ed6cd73eb8
Nov 25 06:29:09 compute-0 ovn_controller[95135]: 2025-11-25T06:29:09Z|00134|binding|INFO|Setting lport 3430a31e-7faf-4e40-951a-5767c915e85e ovn-installed in OVS
Nov 25 06:29:09 compute-0 ovn_controller[95135]: 2025-11-25T06:29:09Z|00135|binding|INFO|Setting lport 3430a31e-7faf-4e40-951a-5767c915e85e up in Southbound
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.456 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.460 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.463 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[3bbdac66-e854-407a-8e6f-643dc5f7d438]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.464 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1d238697-f1 in ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.465 211354 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1d238697-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.465 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5ab74c3d-1f67-4dc4-8936-de6ae8ece259]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.465 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[9e72e709-830e-42e2-9cfa-0e223202880e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 systemd-udevd[216348]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:29:09 compute-0 systemd-machined[152921]: New machine qemu-9-instance-00000009.
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.478 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[5fcd8373-40e9-4bfe-9182-605110f270a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Nov 25 06:29:09 compute-0 NetworkManager[55345]: <info>  [1764052149.4827] device (tap3430a31e-7f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:29:09 compute-0 NetworkManager[55345]: <info>  [1764052149.4836] device (tap3430a31e-7f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.489 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[bb1c8699-534c-4489-952b-2327c9acb5e7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.508 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[f01a4e82-6106-493b-af29-ca4ec54b513d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.511 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ad84482a-b4d0-434a-81a5-ba2cf8f6ca15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 NetworkManager[55345]: <info>  [1764052149.5125] manager: (tap1d238697-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.536 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[70a44d2c-5049-4386-bc7d-b45f21f616f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.539 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[b1670cc4-2299-4d80-916d-034cbf8871ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 NetworkManager[55345]: <info>  [1764052149.5548] device (tap1d238697-f0): carrier: link connected
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.558 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[248560a1-5c01-45a6-9d51-8c0496e6674c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.572 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[81152d82-4218-4846-b719-d70c25612099]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d238697-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:8d:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 313795, 'reachable_time': 41358, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216372, 'error': None, 'target': 'ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.582 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[733ef9a0-5dad-45a4-bcd6-98c9ba096d5b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe17:8d19'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 313795, 'tstamp': 313795}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216373, 'error': None, 'target': 'ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.592 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ca528592-a241-46cf-acb3-011a15cdd013]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d238697-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:8d:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 313795, 'reachable_time': 41358, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216374, 'error': None, 'target': 'ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.612 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[bab2ce50-f3ad-49bf-9b1d-748a64ef4d3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.654 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8fbcd0bb-f5a8-47d1-9804-9e6303a79109]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.655 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d238697-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.656 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.656 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d238697-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.657 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:09 compute-0 kernel: tap1d238697-f0: entered promiscuous mode
Nov 25 06:29:09 compute-0 NetworkManager[55345]: <info>  [1764052149.6598] manager: (tap1d238697-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.661 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1d238697-f0, col_values=(('external_ids', {'iface-id': '8b4b1775-6249-4ffb-bb90-6cf9cfca84ca'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:29:09 compute-0 ovn_controller[95135]: 2025-11-25T06:29:09Z|00136|binding|INFO|Releasing lport 8b4b1775-6249-4ffb-bb90-6cf9cfca84ca from this chassis (sb_readonly=0)
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.661 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.663 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[86d24b69-b330-4b84-a2ce-943ae53be2d2]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.664 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.664 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.664 103953 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 1d238697-f844-4698-9f1c-19ed6cd73eb8 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.665 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.665 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[509ca146-e56b-4bbf-9133-1638064ac881]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.665 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.666 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[cd639da8-daf1-4352-8011-c2143d266c75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.666 103953 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: global
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     log         /dev/log local0 debug
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     log-tag     haproxy-metadata-proxy-1d238697-f844-4698-9f1c-19ed6cd73eb8
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     user        root
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     group       root
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     maxconn     1024
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     pidfile     /var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     daemon
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: defaults
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     log global
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     mode http
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     option httplog
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     option dontlognull
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     option http-server-close
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     option forwardfor
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     retries                 3
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     timeout http-request    30s
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     timeout connect         30s
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     timeout client          32s
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     timeout server          32s
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     timeout http-keep-alive 30s
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: listen listener
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     bind 169.254.169.254:80
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:     http-request add-header X-OVN-Network-ID 1d238697-f844-4698-9f1c-19ed6cd73eb8
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 06:29:09 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:09.668 103953 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'env', 'PROCESS_TAG=haproxy-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1d238697-f844-4698-9f1c-19ed6cd73eb8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.674 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.730 186245 DEBUG nova.compute.manager [req-ed07e020-1242-4f57-b353-0d26e8342fa1 req-7b94f6a5-9dca-4989-863f-15d231a83971 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Received event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.731 186245 DEBUG oslo_concurrency.lockutils [req-ed07e020-1242-4f57-b353-0d26e8342fa1 req-7b94f6a5-9dca-4989-863f-15d231a83971 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.731 186245 DEBUG oslo_concurrency.lockutils [req-ed07e020-1242-4f57-b353-0d26e8342fa1 req-7b94f6a5-9dca-4989-863f-15d231a83971 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.731 186245 DEBUG oslo_concurrency.lockutils [req-ed07e020-1242-4f57-b353-0d26e8342fa1 req-7b94f6a5-9dca-4989-863f-15d231a83971 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:09 compute-0 nova_compute[186241]: 2025-11-25 06:29:09.732 186245 DEBUG nova.compute.manager [req-ed07e020-1242-4f57-b353-0d26e8342fa1 req-7b94f6a5-9dca-4989-863f-15d231a83971 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Processing event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:29:09 compute-0 podman[216404]: 2025-11-25 06:29:09.944784536 +0000 UTC m=+0.030005053 container create 3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:29:09 compute-0 systemd[1]: Started libpod-conmon-3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa.scope.
Nov 25 06:29:09 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:29:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f7192bc6048b3789fedaa3f77645ddee6acfae708739243b763c1d2d29ad577/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:29:09 compute-0 podman[216404]: 2025-11-25 06:29:09.99589624 +0000 UTC m=+0.081116768 container init 3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 25 06:29:10 compute-0 podman[216404]: 2025-11-25 06:29:10.001435276 +0000 UTC m=+0.086655792 container start 3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 25 06:29:10 compute-0 podman[216404]: 2025-11-25 06:29:09.931806352 +0000 UTC m=+0.017026889 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:29:10 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[216416]: [NOTICE]   (216420) : New worker (216422) forked
Nov 25 06:29:10 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[216416]: [NOTICE]   (216420) : Loading success.
Nov 25 06:29:10 compute-0 nova_compute[186241]: 2025-11-25 06:29:10.615 186245 DEBUG nova.compute.manager [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:29:10 compute-0 nova_compute[186241]: 2025-11-25 06:29:10.619 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:29:10 compute-0 nova_compute[186241]: 2025-11-25 06:29:10.621 186245 INFO nova.virt.libvirt.driver [-] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Instance spawned successfully.
Nov 25 06:29:10 compute-0 nova_compute[186241]: 2025-11-25 06:29:10.621 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.131 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.132 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.132 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.132 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.133 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.133 186245 DEBUG nova.virt.libvirt.driver [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.419 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.638 186245 INFO nova.compute.manager [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Took 14.22 seconds to spawn the instance on the hypervisor.
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.639 186245 DEBUG nova.compute.manager [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.886 186245 DEBUG nova.compute.manager [req-d8ae4496-7f0d-4577-a453-719314c58528 req-0a89fac6-3ed0-491f-9c90-b3c28b984daa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Received event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.886 186245 DEBUG oslo_concurrency.lockutils [req-d8ae4496-7f0d-4577-a453-719314c58528 req-0a89fac6-3ed0-491f-9c90-b3c28b984daa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.886 186245 DEBUG oslo_concurrency.lockutils [req-d8ae4496-7f0d-4577-a453-719314c58528 req-0a89fac6-3ed0-491f-9c90-b3c28b984daa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.887 186245 DEBUG oslo_concurrency.lockutils [req-d8ae4496-7f0d-4577-a453-719314c58528 req-0a89fac6-3ed0-491f-9c90-b3c28b984daa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.887 186245 DEBUG nova.compute.manager [req-d8ae4496-7f0d-4577-a453-719314c58528 req-0a89fac6-3ed0-491f-9c90-b3c28b984daa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] No waiting events found dispatching network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:29:11 compute-0 nova_compute[186241]: 2025-11-25 06:29:11.887 186245 WARNING nova.compute.manager [req-d8ae4496-7f0d-4577-a453-719314c58528 req-0a89fac6-3ed0-491f-9c90-b3c28b984daa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Received unexpected event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e for instance with vm_state active and task_state None.
Nov 25 06:29:12 compute-0 nova_compute[186241]: 2025-11-25 06:29:12.049 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:12 compute-0 podman[216434]: 2025-11-25 06:29:12.062845874 +0000 UTC m=+0.042158091 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:29:12 compute-0 nova_compute[186241]: 2025-11-25 06:29:12.151 186245 INFO nova.compute.manager [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Took 19.34 seconds to build instance.
Nov 25 06:29:12 compute-0 nova_compute[186241]: 2025-11-25 06:29:12.653 186245 DEBUG oslo_concurrency.lockutils [None req-1ae2eaf5-0d5d-44d4-ab47-16cdc7623bd0 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.849s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:16 compute-0 nova_compute[186241]: 2025-11-25 06:29:16.422 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:17 compute-0 nova_compute[186241]: 2025-11-25 06:29:17.050 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:17 compute-0 podman[216450]: 2025-11-25 06:29:17.064620684 +0000 UTC m=+0.039035331 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.200 186245 DEBUG oslo_concurrency.lockutils [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "00aa090c-560a-41d0-81f1-858b407a81e1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.203 186245 DEBUG oslo_concurrency.lockutils [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.203 186245 DEBUG oslo_concurrency.lockutils [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.204 186245 DEBUG oslo_concurrency.lockutils [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.204 186245 DEBUG oslo_concurrency.lockutils [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.205 186245 INFO nova.compute.manager [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Terminating instance
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.708 186245 DEBUG nova.compute.manager [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:29:18 compute-0 kernel: tap3430a31e-7f (unregistering): left promiscuous mode
Nov 25 06:29:18 compute-0 NetworkManager[55345]: <info>  [1764052158.7204] device (tap3430a31e-7f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:29:18 compute-0 ovn_controller[95135]: 2025-11-25T06:29:18Z|00137|binding|INFO|Releasing lport 3430a31e-7faf-4e40-951a-5767c915e85e from this chassis (sb_readonly=0)
Nov 25 06:29:18 compute-0 ovn_controller[95135]: 2025-11-25T06:29:18Z|00138|binding|INFO|Setting lport 3430a31e-7faf-4e40-951a-5767c915e85e down in Southbound
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.725 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:18 compute-0 ovn_controller[95135]: 2025-11-25T06:29:18Z|00139|binding|INFO|Removing iface tap3430a31e-7f ovn-installed in OVS
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.726 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.738 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:54:ff 10.100.0.12'], port_security=['fa:16:3e:f1:54:ff 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1919168084', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '00aa090c-560a-41d0-81f1-858b407a81e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1919168084', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'dafbde12-3514-4e2d-980f-9529576187d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.238', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ac93797-d190-4534-9cfc-8a64cabfa9fd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=3430a31e-7faf-4e40-951a-5767c915e85e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.739 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 3430a31e-7faf-4e40-951a-5767c915e85e in datapath 1d238697-f844-4698-9f1c-19ed6cd73eb8 unbound from our chassis
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.741 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d238697-f844-4698-9f1c-19ed6cd73eb8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.741 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[e7052f2a-fce2-45b2-b382-fe679aa4c028]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.742 103953 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8 namespace which is not needed anymore
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.745 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:18 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 25 06:29:18 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 9.298s CPU time.
Nov 25 06:29:18 compute-0 systemd-machined[152921]: Machine qemu-9-instance-00000009 terminated.
Nov 25 06:29:18 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[216416]: [NOTICE]   (216420) : haproxy version is 2.8.14-c23fe91
Nov 25 06:29:18 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[216416]: [NOTICE]   (216420) : path to executable is /usr/sbin/haproxy
Nov 25 06:29:18 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[216416]: [WARNING]  (216420) : Exiting Master process...
Nov 25 06:29:18 compute-0 podman[216490]: 2025-11-25 06:29:18.827327208 +0000 UTC m=+0.021122738 container kill 3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 06:29:18 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[216416]: [ALERT]    (216420) : Current worker (216422) exited with code 143 (Terminated)
Nov 25 06:29:18 compute-0 neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8[216416]: [WARNING]  (216420) : All workers exited. Exiting... (0)
Nov 25 06:29:18 compute-0 systemd[1]: libpod-3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa.scope: Deactivated successfully.
Nov 25 06:29:18 compute-0 conmon[216416]: conmon 3e9decec87d31f0a8c55 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa.scope/container/memory.events
Nov 25 06:29:18 compute-0 podman[216502]: 2025-11-25 06:29:18.860891719 +0000 UTC m=+0.017972390 container died 3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.schema-version=1.0)
Nov 25 06:29:18 compute-0 podman[216502]: 2025-11-25 06:29:18.877084272 +0000 UTC m=+0.034164933 container cleanup 3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:29:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-9f7192bc6048b3789fedaa3f77645ddee6acfae708739243b763c1d2d29ad577-merged.mount: Deactivated successfully.
Nov 25 06:29:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa-userdata-shm.mount: Deactivated successfully.
Nov 25 06:29:18 compute-0 systemd[1]: libpod-conmon-3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa.scope: Deactivated successfully.
Nov 25 06:29:18 compute-0 podman[216503]: 2025-11-25 06:29:18.886279937 +0000 UTC m=+0.040499154 container remove 3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.889 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4d439e-bc82-4fb3-b238-e3f2c62b23c7]: (4, ("Tue Nov 25 06:29:18 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8 (3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa)\n3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa\nTue Nov 25 06:29:18 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8 (3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa)\n3e9decec87d31f0a8c5596248a6489f2f1896a29d27bcd735e81b2361d5e58aa\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.890 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[2eac1061-b509-4f31-a907-42d49a581f2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.891 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d238697-f844-4698-9f1c-19ed6cd73eb8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.891 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[d008c8dc-d44f-4553-88c6-1e5a56cd23a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.892 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d238697-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:29:18 compute-0 kernel: tap1d238697-f0: left promiscuous mode
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.893 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.907 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.908 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.910 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5a53e9-ae6c-401c-a4d5-88bf9515a730]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.917 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[94122f97-6764-42b1-bca8-8d21c79b02d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.919 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[91b5c18f-d4dc-4f93-b1ff-d8cf3d9c53ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.932 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[de339cfd-19fd-498b-89be-3df9b7d953a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 313791, 'reachable_time': 24779, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216534, 'error': None, 'target': 'ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:18 compute-0 systemd[1]: run-netns-ovnmeta\x2d1d238697\x2df844\x2d4698\x2d9f1c\x2d19ed6cd73eb8.mount: Deactivated successfully.
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.935 104066 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1d238697-f844-4698-9f1c-19ed6cd73eb8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.935 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9a641a-5f76-4781-b976-29e8d19a1195]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:18 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:18.936 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.944 186245 INFO nova.virt.libvirt.driver [-] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Instance destroyed successfully.
Nov 25 06:29:18 compute-0 nova_compute[186241]: 2025-11-25 06:29:18.945 186245 DEBUG nova.objects.instance [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid 00aa090c-560a-41d0-81f1-858b407a81e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.002 186245 DEBUG nova.compute.manager [req-d4a1ade3-7b79-4c8b-b807-67cf12ac0ecc req-2ae9a584-1cc0-455c-b5f2-413a2313b634 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Received event network-vif-unplugged-3430a31e-7faf-4e40-951a-5767c915e85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.002 186245 DEBUG oslo_concurrency.lockutils [req-d4a1ade3-7b79-4c8b-b807-67cf12ac0ecc req-2ae9a584-1cc0-455c-b5f2-413a2313b634 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.002 186245 DEBUG oslo_concurrency.lockutils [req-d4a1ade3-7b79-4c8b-b807-67cf12ac0ecc req-2ae9a584-1cc0-455c-b5f2-413a2313b634 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.003 186245 DEBUG oslo_concurrency.lockutils [req-d4a1ade3-7b79-4c8b-b807-67cf12ac0ecc req-2ae9a584-1cc0-455c-b5f2-413a2313b634 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.003 186245 DEBUG nova.compute.manager [req-d4a1ade3-7b79-4c8b-b807-67cf12ac0ecc req-2ae9a584-1cc0-455c-b5f2-413a2313b634 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] No waiting events found dispatching network-vif-unplugged-3430a31e-7faf-4e40-951a-5767c915e85e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.003 186245 DEBUG nova.compute.manager [req-d4a1ade3-7b79-4c8b-b807-67cf12ac0ecc req-2ae9a584-1cc0-455c-b5f2-413a2313b634 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Received event network-vif-unplugged-3430a31e-7faf-4e40-951a-5767c915e85e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.452 186245 DEBUG nova.virt.libvirt.vif [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1638078176',display_name='tempest-TestNetworkBasicOps-server-1638078176',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1638078176',id=9,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/XLF7yR40raD9jQ7djYGBtxFFBzniEKfXdM01FU98ONCx+gzIy0tQcpBYK/0yHgRUDrChGFbxsjyfM/YKxQUo6ZSly2BctH3ubafa30QOf8S7wYwGVVWP0IGIGHlKA7Q==',key_name='tempest-TestNetworkBasicOps-752136947',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:29:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-aek07u77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:29:11Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=00aa090c-560a-41d0-81f1-858b407a81e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.452 186245 DEBUG nova.network.os_vif_util [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "3430a31e-7faf-4e40-951a-5767c915e85e", "address": "fa:16:3e:f1:54:ff", "network": {"id": "1d238697-f844-4698-9f1c-19ed6cd73eb8", "bridge": "br-int", "label": "tempest-network-smoke--4362652", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3430a31e-7f", "ovs_interfaceid": "3430a31e-7faf-4e40-951a-5767c915e85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.453 186245 DEBUG nova.network.os_vif_util [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.453 186245 DEBUG os_vif [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.454 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.455 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3430a31e-7f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.457 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.458 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.458 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=640ae5a3-1556-4b9d-9d31-7b6e09edd981) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.459 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.460 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.461 186245 INFO os_vif [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:54:ff,bridge_name='br-int',has_traffic_filtering=True,id=3430a31e-7faf-4e40-951a-5767c915e85e,network=Network(1d238697-f844-4698-9f1c-19ed6cd73eb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3430a31e-7f')
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.461 186245 INFO nova.virt.libvirt.driver [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Deleting instance files /var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1_del
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.462 186245 INFO nova.virt.libvirt.driver [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Deletion of /var/lib/nova/instances/00aa090c-560a-41d0-81f1-858b407a81e1_del complete
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.969 186245 INFO nova.compute.manager [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Took 1.26 seconds to destroy the instance on the hypervisor.
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.969 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.969 186245 DEBUG nova.compute.manager [-] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:29:19 compute-0 nova_compute[186241]: 2025-11-25 06:29:19.969 186245 DEBUG nova.network.neutron [-] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:29:21 compute-0 nova_compute[186241]: 2025-11-25 06:29:21.160 186245 DEBUG nova.compute.manager [req-8009bac1-cb1a-4738-8a04-35129e58b8c4 req-bfd2c1e3-c17b-464c-a6b4-9c2dc3679e89 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Received event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:29:21 compute-0 nova_compute[186241]: 2025-11-25 06:29:21.160 186245 DEBUG oslo_concurrency.lockutils [req-8009bac1-cb1a-4738-8a04-35129e58b8c4 req-bfd2c1e3-c17b-464c-a6b4-9c2dc3679e89 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:21 compute-0 nova_compute[186241]: 2025-11-25 06:29:21.161 186245 DEBUG oslo_concurrency.lockutils [req-8009bac1-cb1a-4738-8a04-35129e58b8c4 req-bfd2c1e3-c17b-464c-a6b4-9c2dc3679e89 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:21 compute-0 nova_compute[186241]: 2025-11-25 06:29:21.161 186245 DEBUG oslo_concurrency.lockutils [req-8009bac1-cb1a-4738-8a04-35129e58b8c4 req-bfd2c1e3-c17b-464c-a6b4-9c2dc3679e89 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:21 compute-0 nova_compute[186241]: 2025-11-25 06:29:21.161 186245 DEBUG nova.compute.manager [req-8009bac1-cb1a-4738-8a04-35129e58b8c4 req-bfd2c1e3-c17b-464c-a6b4-9c2dc3679e89 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] No waiting events found dispatching network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:29:21 compute-0 nova_compute[186241]: 2025-11-25 06:29:21.161 186245 WARNING nova.compute.manager [req-8009bac1-cb1a-4738-8a04-35129e58b8c4 req-bfd2c1e3-c17b-464c-a6b4-9c2dc3679e89 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Received unexpected event network-vif-plugged-3430a31e-7faf-4e40-951a-5767c915e85e for instance with vm_state active and task_state deleting.
Nov 25 06:29:22 compute-0 nova_compute[186241]: 2025-11-25 06:29:22.016 186245 DEBUG nova.network.neutron [-] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:29:22 compute-0 nova_compute[186241]: 2025-11-25 06:29:22.052 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:22 compute-0 nova_compute[186241]: 2025-11-25 06:29:22.520 186245 INFO nova.compute.manager [-] [instance: 00aa090c-560a-41d0-81f1-858b407a81e1] Took 2.55 seconds to deallocate network for instance.
Nov 25 06:29:22 compute-0 nova_compute[186241]: 2025-11-25 06:29:22.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:29:23 compute-0 nova_compute[186241]: 2025-11-25 06:29:23.025 186245 DEBUG oslo_concurrency.lockutils [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:23 compute-0 nova_compute[186241]: 2025-11-25 06:29:23.025 186245 DEBUG oslo_concurrency.lockutils [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:23 compute-0 podman[216547]: 2025-11-25 06:29:23.071909247 +0000 UTC m=+0.046376925 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125)
Nov 25 06:29:23 compute-0 nova_compute[186241]: 2025-11-25 06:29:23.080 186245 DEBUG nova.compute.provider_tree [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:29:23 compute-0 nova_compute[186241]: 2025-11-25 06:29:23.583 186245 DEBUG nova.scheduler.client.report [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:29:23 compute-0 nova_compute[186241]: 2025-11-25 06:29:23.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:29:24 compute-0 nova_compute[186241]: 2025-11-25 06:29:24.089 186245 DEBUG oslo_concurrency.lockutils [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:24 compute-0 nova_compute[186241]: 2025-11-25 06:29:24.114 186245 INFO nova.scheduler.client.report [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance 00aa090c-560a-41d0-81f1-858b407a81e1
Nov 25 06:29:24 compute-0 nova_compute[186241]: 2025-11-25 06:29:24.459 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:24 compute-0 nova_compute[186241]: 2025-11-25 06:29:24.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:29:25 compute-0 nova_compute[186241]: 2025-11-25 06:29:25.123 186245 DEBUG oslo_concurrency.lockutils [None req-91b3e498-0a99-4462-a0b8-c45b776322c4 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "00aa090c-560a-41d0-81f1-858b407a81e1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.921s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:25 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:25.937 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:29:27 compute-0 nova_compute[186241]: 2025-11-25 06:29:27.052 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:27 compute-0 podman[216564]: 2025-11-25 06:29:27.08184848 +0000 UTC m=+0.060508343 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:29:27 compute-0 nova_compute[186241]: 2025-11-25 06:29:27.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:29:27 compute-0 nova_compute[186241]: 2025-11-25 06:29:27.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:29:27 compute-0 nova_compute[186241]: 2025-11-25 06:29:27.931 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:29:27 compute-0 nova_compute[186241]: 2025-11-25 06:29:27.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:29:28 compute-0 nova_compute[186241]: 2025-11-25 06:29:28.439 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:28 compute-0 nova_compute[186241]: 2025-11-25 06:29:28.439 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:28 compute-0 nova_compute[186241]: 2025-11-25 06:29:28.439 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:28 compute-0 nova_compute[186241]: 2025-11-25 06:29:28.439 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:29:28 compute-0 nova_compute[186241]: 2025-11-25 06:29:28.628 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:29:28 compute-0 nova_compute[186241]: 2025-11-25 06:29:28.629 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5762MB free_disk=73.01791381835938GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:29:28 compute-0 nova_compute[186241]: 2025-11-25 06:29:28.629 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:28 compute-0 nova_compute[186241]: 2025-11-25 06:29:28.629 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:29 compute-0 nova_compute[186241]: 2025-11-25 06:29:29.460 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:29 compute-0 nova_compute[186241]: 2025-11-25 06:29:29.659 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:29:29 compute-0 nova_compute[186241]: 2025-11-25 06:29:29.659 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:29:29 compute-0 nova_compute[186241]: 2025-11-25 06:29:29.675 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing inventories for resource provider b9b31722-b833-4ea1-a013-247935742e36 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:822
Nov 25 06:29:29 compute-0 nova_compute[186241]: 2025-11-25 06:29:29.690 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating ProviderTree inventory for provider b9b31722-b833-4ea1-a013-247935742e36 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:786
Nov 25 06:29:29 compute-0 nova_compute[186241]: 2025-11-25 06:29:29.690 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating inventory in ProviderTree for provider b9b31722-b833-4ea1-a013-247935742e36 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 25 06:29:29 compute-0 nova_compute[186241]: 2025-11-25 06:29:29.699 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing aggregate associations for resource provider b9b31722-b833-4ea1-a013-247935742e36, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:831
Nov 25 06:29:29 compute-0 nova_compute[186241]: 2025-11-25 06:29:29.712 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing trait associations for resource provider b9b31722-b833-4ea1-a013-247935742e36, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_AVX512VPCLMULQDQ,HW_CPU_X86_AESNI,HW_CPU_X86_F16C,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,HW_ARCH_X86_64,HW_CPU_X86_AMD_SVM,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX512VAES,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX2,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:843
Nov 25 06:29:29 compute-0 nova_compute[186241]: 2025-11-25 06:29:29.725 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:29:30 compute-0 nova_compute[186241]: 2025-11-25 06:29:30.228 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:29:30 compute-0 nova_compute[186241]: 2025-11-25 06:29:30.733 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:29:30 compute-0 nova_compute[186241]: 2025-11-25 06:29:30.734 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:31 compute-0 nova_compute[186241]: 2025-11-25 06:29:31.729 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:29:31 compute-0 nova_compute[186241]: 2025-11-25 06:29:31.730 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:29:32 compute-0 nova_compute[186241]: 2025-11-25 06:29:32.054 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:32 compute-0 nova_compute[186241]: 2025-11-25 06:29:32.235 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:29:33 compute-0 nova_compute[186241]: 2025-11-25 06:29:33.627 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:33 compute-0 nova_compute[186241]: 2025-11-25 06:29:33.704 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:34 compute-0 nova_compute[186241]: 2025-11-25 06:29:34.462 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:35 compute-0 podman[216587]: 2025-11-25 06:29:35.079097642 +0000 UTC m=+0.058137834 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 06:29:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:36.637 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:08:3d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-726f016a-ee65-4a75-be87-3386221dc835', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726f016a-ee65-4a75-be87-3386221dc835', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9dcc51a4-c804-4eab-90b5-720685a9ca99, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3480c600-85f3-459e-80b0-348f3e309bfd) old=Port_Binding(mac=['fa:16:3e:8c:08:3d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-726f016a-ee65-4a75-be87-3386221dc835', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726f016a-ee65-4a75-be87-3386221dc835', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:29:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:36.639 103953 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3480c600-85f3-459e-80b0-348f3e309bfd in datapath 726f016a-ee65-4a75-be87-3386221dc835 updated
Nov 25 06:29:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:36.639 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 726f016a-ee65-4a75-be87-3386221dc835, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:29:36 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:36.640 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[23101303-c48b-423c-a0e7-2a1c2729263b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:29:37 compute-0 nova_compute[186241]: 2025-11-25 06:29:37.056 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:39 compute-0 podman[216611]: 2025-11-25 06:29:39.057901355 +0000 UTC m=+0.035814154 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:29:39 compute-0 podman[216610]: 2025-11-25 06:29:39.062895317 +0000 UTC m=+0.041609117 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:29:39 compute-0 nova_compute[186241]: 2025-11-25 06:29:39.465 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:42 compute-0 nova_compute[186241]: 2025-11-25 06:29:42.058 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:43 compute-0 podman[216648]: 2025-11-25 06:29:43.078909624 +0000 UTC m=+0.058859554 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Nov 25 06:29:44 compute-0 nova_compute[186241]: 2025-11-25 06:29:44.467 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:47 compute-0 nova_compute[186241]: 2025-11-25 06:29:47.060 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:47.691 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:47.692 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:29:47.692 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:48 compute-0 podman[216665]: 2025-11-25 06:29:48.08492069 +0000 UTC m=+0.064372286 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter)
Nov 25 06:29:49 compute-0 nova_compute[186241]: 2025-11-25 06:29:49.468 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:50 compute-0 nova_compute[186241]: 2025-11-25 06:29:50.530 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "38ebec87-f0fc-428a-9751-f97953e7c554" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:50 compute-0 nova_compute[186241]: 2025-11-25 06:29:50.531 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:51 compute-0 nova_compute[186241]: 2025-11-25 06:29:51.033 186245 DEBUG nova.compute.manager [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:29:51 compute-0 nova_compute[186241]: 2025-11-25 06:29:51.565 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:51 compute-0 nova_compute[186241]: 2025-11-25 06:29:51.565 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:51 compute-0 nova_compute[186241]: 2025-11-25 06:29:51.572 186245 DEBUG nova.virt.hardware [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:29:51 compute-0 nova_compute[186241]: 2025-11-25 06:29:51.572 186245 INFO nova.compute.claims [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:29:52 compute-0 nova_compute[186241]: 2025-11-25 06:29:52.062 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:52 compute-0 nova_compute[186241]: 2025-11-25 06:29:52.614 186245 DEBUG nova.compute.provider_tree [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:29:53 compute-0 nova_compute[186241]: 2025-11-25 06:29:53.118 186245 DEBUG nova.scheduler.client.report [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:29:53 compute-0 nova_compute[186241]: 2025-11-25 06:29:53.622 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.057s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:53 compute-0 nova_compute[186241]: 2025-11-25 06:29:53.623 186245 DEBUG nova.compute.manager [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:29:54 compute-0 podman[216684]: 2025-11-25 06:29:54.053876543 +0000 UTC m=+0.036510425 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125)
Nov 25 06:29:54 compute-0 nova_compute[186241]: 2025-11-25 06:29:54.130 186245 DEBUG nova.compute.manager [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:29:54 compute-0 nova_compute[186241]: 2025-11-25 06:29:54.130 186245 DEBUG nova.network.neutron [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:29:54 compute-0 nova_compute[186241]: 2025-11-25 06:29:54.467 186245 DEBUG nova.policy [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:29:54 compute-0 nova_compute[186241]: 2025-11-25 06:29:54.469 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:54 compute-0 nova_compute[186241]: 2025-11-25 06:29:54.634 186245 INFO nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:29:55 compute-0 nova_compute[186241]: 2025-11-25 06:29:55.137 186245 DEBUG nova.compute.manager [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:29:55 compute-0 nova_compute[186241]: 2025-11-25 06:29:55.666 186245 DEBUG nova.network.neutron [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Successfully created port: 975b8d2c-d44e-424b-8044-846b81925518 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.146 186245 DEBUG nova.compute.manager [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.147 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.147 186245 INFO nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Creating image(s)
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.148 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.148 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.149 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.149 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.152 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.153 186245 DEBUG oslo_concurrency.processutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.197 186245 DEBUG oslo_concurrency.processutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.197 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.198 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.198 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.201 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.202 186245 DEBUG oslo_concurrency.processutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.242 186245 DEBUG oslo_concurrency.processutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.243 186245 DEBUG oslo_concurrency.processutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.260 186245 DEBUG oslo_concurrency.processutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk 1073741824" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.261 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.261 186245 DEBUG oslo_concurrency.processutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.302 186245 DEBUG oslo_concurrency.processutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.303 186245 DEBUG nova.virt.disk.api [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.303 186245 DEBUG oslo_concurrency.processutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.344 186245 DEBUG oslo_concurrency.processutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.345 186245 DEBUG nova.virt.disk.api [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.345 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.345 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Ensure instance console log exists: /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.346 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.346 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.346 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.662 186245 DEBUG nova.network.neutron [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Successfully updated port: 975b8d2c-d44e-424b-8044-846b81925518 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.813 186245 DEBUG nova.compute.manager [req-76dcbb39-81ea-4548-9d94-6cebf189a6e7 req-17ca7aaa-88b7-4798-99bb-ece69711257e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Received event network-changed-975b8d2c-d44e-424b-8044-846b81925518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.813 186245 DEBUG nova.compute.manager [req-76dcbb39-81ea-4548-9d94-6cebf189a6e7 req-17ca7aaa-88b7-4798-99bb-ece69711257e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Refreshing instance network info cache due to event network-changed-975b8d2c-d44e-424b-8044-846b81925518. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.813 186245 DEBUG oslo_concurrency.lockutils [req-76dcbb39-81ea-4548-9d94-6cebf189a6e7 req-17ca7aaa-88b7-4798-99bb-ece69711257e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-38ebec87-f0fc-428a-9751-f97953e7c554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.813 186245 DEBUG oslo_concurrency.lockutils [req-76dcbb39-81ea-4548-9d94-6cebf189a6e7 req-17ca7aaa-88b7-4798-99bb-ece69711257e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-38ebec87-f0fc-428a-9751-f97953e7c554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:29:56 compute-0 nova_compute[186241]: 2025-11-25 06:29:56.813 186245 DEBUG nova.network.neutron [req-76dcbb39-81ea-4548-9d94-6cebf189a6e7 req-17ca7aaa-88b7-4798-99bb-ece69711257e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Refreshing network info cache for port 975b8d2c-d44e-424b-8044-846b81925518 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:29:57 compute-0 nova_compute[186241]: 2025-11-25 06:29:57.063 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:57 compute-0 nova_compute[186241]: 2025-11-25 06:29:57.165 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-38ebec87-f0fc-428a-9751-f97953e7c554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:29:58 compute-0 podman[216716]: 2025-11-25 06:29:58.05186008 +0000 UTC m=+0.032452476 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:29:58 compute-0 nova_compute[186241]: 2025-11-25 06:29:58.152 186245 DEBUG nova.network.neutron [req-76dcbb39-81ea-4548-9d94-6cebf189a6e7 req-17ca7aaa-88b7-4798-99bb-ece69711257e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:29:59 compute-0 nova_compute[186241]: 2025-11-25 06:29:59.268 186245 DEBUG nova.network.neutron [req-76dcbb39-81ea-4548-9d94-6cebf189a6e7 req-17ca7aaa-88b7-4798-99bb-ece69711257e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:29:59 compute-0 nova_compute[186241]: 2025-11-25 06:29:59.470 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.550 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:29:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:29:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:29:59 compute-0 nova_compute[186241]: 2025-11-25 06:29:59.774 186245 DEBUG oslo_concurrency.lockutils [req-76dcbb39-81ea-4548-9d94-6cebf189a6e7 req-17ca7aaa-88b7-4798-99bb-ece69711257e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-38ebec87-f0fc-428a-9751-f97953e7c554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:29:59 compute-0 nova_compute[186241]: 2025-11-25 06:29:59.774 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-38ebec87-f0fc-428a-9751-f97953e7c554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:29:59 compute-0 nova_compute[186241]: 2025-11-25 06:29:59.774 186245 DEBUG nova.network.neutron [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:30:00 compute-0 nova_compute[186241]: 2025-11-25 06:30:00.628 186245 DEBUG nova.network.neutron [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.066 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.087 186245 DEBUG nova.network.neutron [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Updating instance_info_cache with network_info: [{"id": "975b8d2c-d44e-424b-8044-846b81925518", "address": "fa:16:3e:3a:48:6c", "network": {"id": "726f016a-ee65-4a75-be87-3386221dc835", "bridge": "br-int", "label": "tempest-network-smoke--1514542140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap975b8d2c-d4", "ovs_interfaceid": "975b8d2c-d44e-424b-8044-846b81925518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.590 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-38ebec87-f0fc-428a-9751-f97953e7c554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.591 186245 DEBUG nova.compute.manager [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Instance network_info: |[{"id": "975b8d2c-d44e-424b-8044-846b81925518", "address": "fa:16:3e:3a:48:6c", "network": {"id": "726f016a-ee65-4a75-be87-3386221dc835", "bridge": "br-int", "label": "tempest-network-smoke--1514542140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap975b8d2c-d4", "ovs_interfaceid": "975b8d2c-d44e-424b-8044-846b81925518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.592 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Start _get_guest_xml network_info=[{"id": "975b8d2c-d44e-424b-8044-846b81925518", "address": "fa:16:3e:3a:48:6c", "network": {"id": "726f016a-ee65-4a75-be87-3386221dc835", "bridge": "br-int", "label": "tempest-network-smoke--1514542140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap975b8d2c-d4", "ovs_interfaceid": "975b8d2c-d44e-424b-8044-846b81925518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.595 186245 WARNING nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.595 186245 DEBUG nova.virt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1169628092', uuid='38ebec87-f0fc-428a-9751-f97953e7c554'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_m
apping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "975b8d2c-d44e-424b-8044-846b81925518", "address": "fa:16:3e:3a:48:6c", "network": {"id": "726f016a-ee65-4a75-be87-3386221dc835", "bridge": "br-int", "label": "tempest-network-smoke--1514542140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap975b8d2c-d4", "ovs_interfaceid": "975b8d2c-d44e-424b-8044-846b81925518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764052202.5957706) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.601 186245 DEBUG nova.virt.libvirt.host [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.601 186245 DEBUG nova.virt.libvirt.host [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.603 186245 DEBUG nova.virt.libvirt.host [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.604 186245 DEBUG nova.virt.libvirt.host [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.604 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.604 186245 DEBUG nova.virt.hardware [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.604 186245 DEBUG nova.virt.hardware [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.605 186245 DEBUG nova.virt.hardware [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.605 186245 DEBUG nova.virt.hardware [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.605 186245 DEBUG nova.virt.hardware [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.605 186245 DEBUG nova.virt.hardware [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.605 186245 DEBUG nova.virt.hardware [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.605 186245 DEBUG nova.virt.hardware [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.606 186245 DEBUG nova.virt.hardware [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.606 186245 DEBUG nova.virt.hardware [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.606 186245 DEBUG nova.virt.hardware [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.608 186245 DEBUG nova.virt.libvirt.vif [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:29:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1169628092',display_name='tempest-TestNetworkBasicOps-server-1169628092',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1169628092',id=10,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE2FhW+QNubceZJArZWViP3HHnq1MkVj3LYH+Qb2Y8Y1wEj4tHjkcM4k8WY26rNPXpnbw/RMSTjbF6xLvLrT3mCEwwBuqLYmryFvMYCCAVPPEsKuI63nuHc/l/9LebrMsA==',key_name='tempest-TestNetworkBasicOps-454611679',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-f28024go',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:29:55Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=38ebec87-f0fc-428a-9751-f97953e7c554,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "975b8d2c-d44e-424b-8044-846b81925518", "address": "fa:16:3e:3a:48:6c", "network": {"id": "726f016a-ee65-4a75-be87-3386221dc835", "bridge": "br-int", "label": "tempest-network-smoke--1514542140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap975b8d2c-d4", "ovs_interfaceid": "975b8d2c-d44e-424b-8044-846b81925518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.608 186245 DEBUG nova.network.os_vif_util [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "975b8d2c-d44e-424b-8044-846b81925518", "address": "fa:16:3e:3a:48:6c", "network": {"id": "726f016a-ee65-4a75-be87-3386221dc835", "bridge": "br-int", "label": "tempest-network-smoke--1514542140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap975b8d2c-d4", "ovs_interfaceid": "975b8d2c-d44e-424b-8044-846b81925518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.609 186245 DEBUG nova.network.os_vif_util [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:48:6c,bridge_name='br-int',has_traffic_filtering=True,id=975b8d2c-d44e-424b-8044-846b81925518,network=Network(726f016a-ee65-4a75-be87-3386221dc835),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap975b8d2c-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:30:02 compute-0 nova_compute[186241]: 2025-11-25 06:30:02.610 186245 DEBUG nova.objects.instance [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid 38ebec87-f0fc-428a-9751-f97953e7c554 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.114 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:30:03 compute-0 nova_compute[186241]:   <uuid>38ebec87-f0fc-428a-9751-f97953e7c554</uuid>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   <name>instance-0000000a</name>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-1169628092</nova:name>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:30:02</nova:creationTime>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:30:03 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:30:03 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:30:03 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:30:03 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:30:03 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:30:03 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:30:03 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:30:03 compute-0 nova_compute[186241]:         <nova:port uuid="975b8d2c-d44e-424b-8044-846b81925518">
Nov 25 06:30:03 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <system>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <entry name="serial">38ebec87-f0fc-428a-9751-f97953e7c554</entry>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <entry name="uuid">38ebec87-f0fc-428a-9751-f97953e7c554</entry>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     </system>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   <os>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   </os>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   <features>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   </features>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk.config"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:3a:48:6c"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <target dev="tap975b8d2c-d4"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/console.log" append="off"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <video>
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     </video>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:30:03 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:30:03 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:30:03 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:30:03 compute-0 nova_compute[186241]: </domain>
Nov 25 06:30:03 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.115 186245 DEBUG nova.compute.manager [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Preparing to wait for external event network-vif-plugged-975b8d2c-d44e-424b-8044-846b81925518 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.115 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.115 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.115 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.116 186245 DEBUG nova.virt.libvirt.vif [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:29:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1169628092',display_name='tempest-TestNetworkBasicOps-server-1169628092',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1169628092',id=10,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE2FhW+QNubceZJArZWViP3HHnq1MkVj3LYH+Qb2Y8Y1wEj4tHjkcM4k8WY26rNPXpnbw/RMSTjbF6xLvLrT3mCEwwBuqLYmryFvMYCCAVPPEsKuI63nuHc/l/9LebrMsA==',key_name='tempest-TestNetworkBasicOps-454611679',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-f28024go',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:29:55Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=38ebec87-f0fc-428a-9751-f97953e7c554,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "975b8d2c-d44e-424b-8044-846b81925518", "address": "fa:16:3e:3a:48:6c", "network": {"id": "726f016a-ee65-4a75-be87-3386221dc835", "bridge": "br-int", "label": "tempest-network-smoke--1514542140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap975b8d2c-d4", "ovs_interfaceid": "975b8d2c-d44e-424b-8044-846b81925518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.116 186245 DEBUG nova.network.os_vif_util [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "975b8d2c-d44e-424b-8044-846b81925518", "address": "fa:16:3e:3a:48:6c", "network": {"id": "726f016a-ee65-4a75-be87-3386221dc835", "bridge": "br-int", "label": "tempest-network-smoke--1514542140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap975b8d2c-d4", "ovs_interfaceid": "975b8d2c-d44e-424b-8044-846b81925518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.116 186245 DEBUG nova.network.os_vif_util [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:48:6c,bridge_name='br-int',has_traffic_filtering=True,id=975b8d2c-d44e-424b-8044-846b81925518,network=Network(726f016a-ee65-4a75-be87-3386221dc835),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap975b8d2c-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.117 186245 DEBUG os_vif [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:48:6c,bridge_name='br-int',has_traffic_filtering=True,id=975b8d2c-d44e-424b-8044-846b81925518,network=Network(726f016a-ee65-4a75-be87-3386221dc835),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap975b8d2c-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.117 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.117 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.117 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.118 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.118 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '37f9819f-f2ed-5c95-a91c-26c8d6d7fe08', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.120 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.121 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.122 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap975b8d2c-d4, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.122 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap975b8d2c-d4, col_values=(('qos', UUID('32c9e002-068d-43b1-9abb-8b0949dd3eb9')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.122 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap975b8d2c-d4, col_values=(('external_ids', {'iface-id': '975b8d2c-d44e-424b-8044-846b81925518', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3a:48:6c', 'vm-uuid': '38ebec87-f0fc-428a-9751-f97953e7c554'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:30:03 compute-0 NetworkManager[55345]: <info>  [1764052203.1245] manager: (tap975b8d2c-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.125 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.126 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:03 compute-0 nova_compute[186241]: 2025-11-25 06:30:03.127 186245 INFO os_vif [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:48:6c,bridge_name='br-int',has_traffic_filtering=True,id=975b8d2c-d44e-424b-8044-846b81925518,network=Network(726f016a-ee65-4a75-be87-3386221dc835),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap975b8d2c-d4')
Nov 25 06:30:04 compute-0 nova_compute[186241]: 2025-11-25 06:30:04.647 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:30:04 compute-0 nova_compute[186241]: 2025-11-25 06:30:04.647 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:30:04 compute-0 nova_compute[186241]: 2025-11-25 06:30:04.647 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:3a:48:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:30:04 compute-0 nova_compute[186241]: 2025-11-25 06:30:04.648 186245 INFO nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Using config drive
Nov 25 06:30:06 compute-0 podman[216739]: 2025-11-25 06:30:06.07422732 +0000 UTC m=+0.053473351 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 25 06:30:07 compute-0 nova_compute[186241]: 2025-11-25 06:30:07.068 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:07 compute-0 ovn_controller[95135]: 2025-11-25T06:30:07Z|00140|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 25 06:30:07 compute-0 nova_compute[186241]: 2025-11-25 06:30:07.282 186245 INFO nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Creating config drive at /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk.config
Nov 25 06:30:07 compute-0 nova_compute[186241]: 2025-11-25 06:30:07.287 186245 DEBUG oslo_concurrency.processutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpk07bv3za execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:30:07 compute-0 nova_compute[186241]: 2025-11-25 06:30:07.403 186245 DEBUG oslo_concurrency.processutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmpk07bv3za" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:30:07 compute-0 kernel: tap975b8d2c-d4: entered promiscuous mode
Nov 25 06:30:07 compute-0 NetworkManager[55345]: <info>  [1764052207.4377] manager: (tap975b8d2c-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Nov 25 06:30:07 compute-0 nova_compute[186241]: 2025-11-25 06:30:07.439 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:07 compute-0 ovn_controller[95135]: 2025-11-25T06:30:07Z|00141|binding|INFO|Claiming lport 975b8d2c-d44e-424b-8044-846b81925518 for this chassis.
Nov 25 06:30:07 compute-0 ovn_controller[95135]: 2025-11-25T06:30:07Z|00142|binding|INFO|975b8d2c-d44e-424b-8044-846b81925518: Claiming fa:16:3e:3a:48:6c 10.100.0.12
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.449 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:48:6c 10.100.0.12'], port_security=['fa:16:3e:3a:48:6c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '38ebec87-f0fc-428a-9751-f97953e7c554', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726f016a-ee65-4a75-be87-3386221dc835', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f947ef18-6e76-4aa1-80a8-1dfd2828a0b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9dcc51a4-c804-4eab-90b5-720685a9ca99, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=975b8d2c-d44e-424b-8044-846b81925518) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.450 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 975b8d2c-d44e-424b-8044-846b81925518 in datapath 726f016a-ee65-4a75-be87-3386221dc835 bound to our chassis
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.451 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 726f016a-ee65-4a75-be87-3386221dc835
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.459 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[41df651e-f89a-4a88-8f8f-1c80492252b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.460 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap726f016a-e1 in ovnmeta-726f016a-ee65-4a75-be87-3386221dc835 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.461 211354 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap726f016a-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.461 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[af88234b-14a9-4193-a5c0-a120ba15db22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.462 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[9b6eb97e-3ac8-4f9e-934d-c17a7ee25619]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 systemd-udevd[216780]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.472 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[be17ab49-fecb-4743-bea3-de192ef0eba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 NetworkManager[55345]: <info>  [1764052207.4735] device (tap975b8d2c-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:30:07 compute-0 NetworkManager[55345]: <info>  [1764052207.4743] device (tap975b8d2c-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:30:07 compute-0 systemd-machined[152921]: New machine qemu-10-instance-0000000a.
Nov 25 06:30:07 compute-0 nova_compute[186241]: 2025-11-25 06:30:07.499 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.503 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[cd524ca0-7e9d-4f2a-a82a-ca310729bd50]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_controller[95135]: 2025-11-25T06:30:07Z|00143|binding|INFO|Setting lport 975b8d2c-d44e-424b-8044-846b81925518 ovn-installed in OVS
Nov 25 06:30:07 compute-0 ovn_controller[95135]: 2025-11-25T06:30:07Z|00144|binding|INFO|Setting lport 975b8d2c-d44e-424b-8044-846b81925518 up in Southbound
Nov 25 06:30:07 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Nov 25 06:30:07 compute-0 nova_compute[186241]: 2025-11-25 06:30:07.506 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.523 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa96ca7-b27a-497e-83fd-6a12b972408d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 systemd-udevd[216783]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:30:07 compute-0 NetworkManager[55345]: <info>  [1764052207.5280] manager: (tap726f016a-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.527 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[a03f4394-8e11-48f6-a8c4-cbef8862f67f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.552 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d606db-d52e-41a8-aaae-9669aa75216b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.555 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[a4eb7c1b-b52f-48ea-849f-001b0417d83a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 NetworkManager[55345]: <info>  [1764052207.5712] device (tap726f016a-e0): carrier: link connected
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.574 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8f44fb-2d33-4b40-ac8a-35f907a08498]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.587 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[92b5cc4b-5e54-4f00-96b0-62162d5ad378]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap726f016a-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:08:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 319597, 'reachable_time': 15228, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216805, 'error': None, 'target': 'ovnmeta-726f016a-ee65-4a75-be87-3386221dc835', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.597 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[1eb79cfa-f8ac-44de-aa93-0651eaf5235b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8c:83d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 319597, 'tstamp': 319597}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216806, 'error': None, 'target': 'ovnmeta-726f016a-ee65-4a75-be87-3386221dc835', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.608 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[2600038a-1df8-4fb5-a872-c6da8a272b8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap726f016a-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:08:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 319597, 'reachable_time': 15228, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216807, 'error': None, 'target': 'ovnmeta-726f016a-ee65-4a75-be87-3386221dc835', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.629 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[d959d75d-2b9d-4a69-a87b-f430902d9236]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.666 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[f996b91b-a8b6-4584-a740-c673d6af0cc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.667 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726f016a-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.667 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.668 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap726f016a-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:30:07 compute-0 kernel: tap726f016a-e0: entered promiscuous mode
Nov 25 06:30:07 compute-0 NetworkManager[55345]: <info>  [1764052207.6712] manager: (tap726f016a-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Nov 25 06:30:07 compute-0 nova_compute[186241]: 2025-11-25 06:30:07.673 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.676 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap726f016a-e0, col_values=(('external_ids', {'iface-id': '3480c600-85f3-459e-80b0-348f3e309bfd'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:30:07 compute-0 ovn_controller[95135]: 2025-11-25T06:30:07Z|00145|binding|INFO|Releasing lport 3480c600-85f3-459e-80b0-348f3e309bfd from this chassis (sb_readonly=0)
Nov 25 06:30:07 compute-0 nova_compute[186241]: 2025-11-25 06:30:07.688 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:07 compute-0 nova_compute[186241]: 2025-11-25 06:30:07.690 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.691 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[37eaea6c-d3da-4539-a9cb-a6125526d4ec]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.691 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726f016a-ee65-4a75-be87-3386221dc835.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726f016a-ee65-4a75-be87-3386221dc835.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.691 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726f016a-ee65-4a75-be87-3386221dc835.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726f016a-ee65-4a75-be87-3386221dc835.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.691 103953 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 726f016a-ee65-4a75-be87-3386221dc835 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.691 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726f016a-ee65-4a75-be87-3386221dc835.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726f016a-ee65-4a75-be87-3386221dc835.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.692 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[cafdb64b-8d6f-486d-b499-a9ed7a78b69f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.692 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726f016a-ee65-4a75-be87-3386221dc835.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726f016a-ee65-4a75-be87-3386221dc835.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.692 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[b453ede9-908d-4dd0-89a0-460157e6e276]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.692 103953 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: global
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     log         /dev/log local0 debug
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     log-tag     haproxy-metadata-proxy-726f016a-ee65-4a75-be87-3386221dc835
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     user        root
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     group       root
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     maxconn     1024
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     pidfile     /var/lib/neutron/external/pids/726f016a-ee65-4a75-be87-3386221dc835.pid.haproxy
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     daemon
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: defaults
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     log global
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     mode http
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     option httplog
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     option dontlognull
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     option http-server-close
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     option forwardfor
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     retries                 3
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     timeout http-request    30s
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     timeout connect         30s
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     timeout client          32s
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     timeout server          32s
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     timeout http-keep-alive 30s
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: listen listener
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     bind 169.254.169.254:80
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:     http-request add-header X-OVN-Network-ID 726f016a-ee65-4a75-be87-3386221dc835
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 06:30:07 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:07.693 103953 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-726f016a-ee65-4a75-be87-3386221dc835', 'env', 'PROCESS_TAG=haproxy-726f016a-ee65-4a75-be87-3386221dc835', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/726f016a-ee65-4a75-be87-3386221dc835.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Nov 25 06:30:07 compute-0 podman[216837]: 2025-11-25 06:30:07.973307765 +0000 UTC m=+0.031538763 container create d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 25 06:30:08 compute-0 systemd[1]: Started libpod-conmon-d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8.scope.
Nov 25 06:30:08 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:30:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2b665e75d2f7d9849b4a82faab80c98c26f66f6f496cbb69a04956edd2f1c0c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:30:08 compute-0 podman[216837]: 2025-11-25 06:30:08.030612008 +0000 UTC m=+0.088843015 container init d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 25 06:30:08 compute-0 podman[216837]: 2025-11-25 06:30:08.036055899 +0000 UTC m=+0.094286887 container start d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:30:08 compute-0 podman[216837]: 2025-11-25 06:30:07.957764886 +0000 UTC m=+0.015995894 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:30:08 compute-0 neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835[216849]: [NOTICE]   (216853) : New worker (216855) forked
Nov 25 06:30:08 compute-0 neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835[216849]: [NOTICE]   (216853) : Loading success.
Nov 25 06:30:08 compute-0 nova_compute[186241]: 2025-11-25 06:30:08.123 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:08 compute-0 nova_compute[186241]: 2025-11-25 06:30:08.305 186245 DEBUG nova.compute.manager [req-7548cee6-6cbe-4d03-888a-aa99b3065ad5 req-3250f662-a636-44f5-a28b-6def73273dbf a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Received event network-vif-plugged-975b8d2c-d44e-424b-8044-846b81925518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:30:08 compute-0 nova_compute[186241]: 2025-11-25 06:30:08.305 186245 DEBUG oslo_concurrency.lockutils [req-7548cee6-6cbe-4d03-888a-aa99b3065ad5 req-3250f662-a636-44f5-a28b-6def73273dbf a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:30:08 compute-0 nova_compute[186241]: 2025-11-25 06:30:08.306 186245 DEBUG oslo_concurrency.lockutils [req-7548cee6-6cbe-4d03-888a-aa99b3065ad5 req-3250f662-a636-44f5-a28b-6def73273dbf a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:30:08 compute-0 nova_compute[186241]: 2025-11-25 06:30:08.306 186245 DEBUG oslo_concurrency.lockutils [req-7548cee6-6cbe-4d03-888a-aa99b3065ad5 req-3250f662-a636-44f5-a28b-6def73273dbf a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:30:08 compute-0 nova_compute[186241]: 2025-11-25 06:30:08.307 186245 DEBUG nova.compute.manager [req-7548cee6-6cbe-4d03-888a-aa99b3065ad5 req-3250f662-a636-44f5-a28b-6def73273dbf a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Processing event network-vif-plugged-975b8d2c-d44e-424b-8044-846b81925518 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:30:08 compute-0 nova_compute[186241]: 2025-11-25 06:30:08.563 186245 DEBUG nova.compute.manager [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:30:08 compute-0 nova_compute[186241]: 2025-11-25 06:30:08.566 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:30:08 compute-0 nova_compute[186241]: 2025-11-25 06:30:08.568 186245 INFO nova.virt.libvirt.driver [-] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Instance spawned successfully.
Nov 25 06:30:08 compute-0 nova_compute[186241]: 2025-11-25 06:30:08.569 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:30:09 compute-0 nova_compute[186241]: 2025-11-25 06:30:09.077 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:30:09 compute-0 nova_compute[186241]: 2025-11-25 06:30:09.078 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:30:09 compute-0 nova_compute[186241]: 2025-11-25 06:30:09.078 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:30:09 compute-0 nova_compute[186241]: 2025-11-25 06:30:09.079 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:30:09 compute-0 nova_compute[186241]: 2025-11-25 06:30:09.079 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:30:09 compute-0 nova_compute[186241]: 2025-11-25 06:30:09.080 186245 DEBUG nova.virt.libvirt.driver [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:30:09 compute-0 nova_compute[186241]: 2025-11-25 06:30:09.587 186245 INFO nova.compute.manager [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Took 13.44 seconds to spawn the instance on the hypervisor.
Nov 25 06:30:09 compute-0 nova_compute[186241]: 2025-11-25 06:30:09.588 186245 DEBUG nova.compute.manager [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:30:10 compute-0 podman[216867]: 2025-11-25 06:30:10.07059427 +0000 UTC m=+0.050755568 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 25 06:30:10 compute-0 podman[216868]: 2025-11-25 06:30:10.079455664 +0000 UTC m=+0.056884883 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:30:10 compute-0 nova_compute[186241]: 2025-11-25 06:30:10.102 186245 INFO nova.compute.manager [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Took 18.56 seconds to build instance.
Nov 25 06:30:10 compute-0 nova_compute[186241]: 2025-11-25 06:30:10.495 186245 DEBUG nova.compute.manager [req-79819163-1bcc-4dea-a9dd-8b4dd44bb043 req-cb81dcd9-f9fa-4016-845a-6ff8fad90b74 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Received event network-vif-plugged-975b8d2c-d44e-424b-8044-846b81925518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:30:10 compute-0 nova_compute[186241]: 2025-11-25 06:30:10.496 186245 DEBUG oslo_concurrency.lockutils [req-79819163-1bcc-4dea-a9dd-8b4dd44bb043 req-cb81dcd9-f9fa-4016-845a-6ff8fad90b74 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:30:10 compute-0 nova_compute[186241]: 2025-11-25 06:30:10.497 186245 DEBUG oslo_concurrency.lockutils [req-79819163-1bcc-4dea-a9dd-8b4dd44bb043 req-cb81dcd9-f9fa-4016-845a-6ff8fad90b74 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:30:10 compute-0 nova_compute[186241]: 2025-11-25 06:30:10.497 186245 DEBUG oslo_concurrency.lockutils [req-79819163-1bcc-4dea-a9dd-8b4dd44bb043 req-cb81dcd9-f9fa-4016-845a-6ff8fad90b74 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:30:10 compute-0 nova_compute[186241]: 2025-11-25 06:30:10.497 186245 DEBUG nova.compute.manager [req-79819163-1bcc-4dea-a9dd-8b4dd44bb043 req-cb81dcd9-f9fa-4016-845a-6ff8fad90b74 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] No waiting events found dispatching network-vif-plugged-975b8d2c-d44e-424b-8044-846b81925518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:30:10 compute-0 nova_compute[186241]: 2025-11-25 06:30:10.497 186245 WARNING nova.compute.manager [req-79819163-1bcc-4dea-a9dd-8b4dd44bb043 req-cb81dcd9-f9fa-4016-845a-6ff8fad90b74 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Received unexpected event network-vif-plugged-975b8d2c-d44e-424b-8044-846b81925518 for instance with vm_state active and task_state None.
Nov 25 06:30:10 compute-0 nova_compute[186241]: 2025-11-25 06:30:10.605 186245 DEBUG oslo_concurrency.lockutils [None req-6407be6e-b2bb-4049-8f90-334b4484e97b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:30:12 compute-0 nova_compute[186241]: 2025-11-25 06:30:12.070 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:13 compute-0 nova_compute[186241]: 2025-11-25 06:30:13.124 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:14 compute-0 podman[216905]: 2025-11-25 06:30:14.056097411 +0000 UTC m=+0.036090975 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:30:16 compute-0 ovn_controller[95135]: 2025-11-25T06:30:16Z|00146|binding|INFO|Releasing lport 3480c600-85f3-459e-80b0-348f3e309bfd from this chassis (sb_readonly=0)
Nov 25 06:30:16 compute-0 nova_compute[186241]: 2025-11-25 06:30:16.387 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:16 compute-0 NetworkManager[55345]: <info>  [1764052216.3891] manager: (patch-br-int-to-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Nov 25 06:30:16 compute-0 NetworkManager[55345]: <info>  [1764052216.3898] manager: (patch-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Nov 25 06:30:16 compute-0 ovn_controller[95135]: 2025-11-25T06:30:16Z|00147|binding|INFO|Releasing lport 3480c600-85f3-459e-80b0-348f3e309bfd from this chassis (sb_readonly=0)
Nov 25 06:30:16 compute-0 nova_compute[186241]: 2025-11-25 06:30:16.411 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:16 compute-0 nova_compute[186241]: 2025-11-25 06:30:16.414 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:17 compute-0 nova_compute[186241]: 2025-11-25 06:30:17.071 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:17 compute-0 nova_compute[186241]: 2025-11-25 06:30:17.687 186245 DEBUG nova.compute.manager [req-c88e411b-5e98-4c41-8673-0f6245ba349f req-541da827-a213-4587-ae2f-08446edf7dd3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Received event network-changed-975b8d2c-d44e-424b-8044-846b81925518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:30:17 compute-0 nova_compute[186241]: 2025-11-25 06:30:17.687 186245 DEBUG nova.compute.manager [req-c88e411b-5e98-4c41-8673-0f6245ba349f req-541da827-a213-4587-ae2f-08446edf7dd3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Refreshing instance network info cache due to event network-changed-975b8d2c-d44e-424b-8044-846b81925518. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:30:17 compute-0 nova_compute[186241]: 2025-11-25 06:30:17.688 186245 DEBUG oslo_concurrency.lockutils [req-c88e411b-5e98-4c41-8673-0f6245ba349f req-541da827-a213-4587-ae2f-08446edf7dd3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-38ebec87-f0fc-428a-9751-f97953e7c554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:30:17 compute-0 nova_compute[186241]: 2025-11-25 06:30:17.688 186245 DEBUG oslo_concurrency.lockutils [req-c88e411b-5e98-4c41-8673-0f6245ba349f req-541da827-a213-4587-ae2f-08446edf7dd3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-38ebec87-f0fc-428a-9751-f97953e7c554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:30:17 compute-0 nova_compute[186241]: 2025-11-25 06:30:17.688 186245 DEBUG nova.network.neutron [req-c88e411b-5e98-4c41-8673-0f6245ba349f req-541da827-a213-4587-ae2f-08446edf7dd3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Refreshing network info cache for port 975b8d2c-d44e-424b-8044-846b81925518 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:30:18 compute-0 nova_compute[186241]: 2025-11-25 06:30:18.127 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:18 compute-0 nova_compute[186241]: 2025-11-25 06:30:18.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:30:18 compute-0 nova_compute[186241]: 2025-11-25 06:30:18.932 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11872
Nov 25 06:30:19 compute-0 podman[216936]: 2025-11-25 06:30:19.062097266 +0000 UTC m=+0.039986768 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc.)
Nov 25 06:30:19 compute-0 ovn_controller[95135]: 2025-11-25T06:30:19Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3a:48:6c 10.100.0.12
Nov 25 06:30:19 compute-0 ovn_controller[95135]: 2025-11-25T06:30:19Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3a:48:6c 10.100.0.12
Nov 25 06:30:21 compute-0 nova_compute[186241]: 2025-11-25 06:30:21.289 186245 DEBUG nova.network.neutron [req-c88e411b-5e98-4c41-8673-0f6245ba349f req-541da827-a213-4587-ae2f-08446edf7dd3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Updated VIF entry in instance network info cache for port 975b8d2c-d44e-424b-8044-846b81925518. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:30:21 compute-0 nova_compute[186241]: 2025-11-25 06:30:21.290 186245 DEBUG nova.network.neutron [req-c88e411b-5e98-4c41-8673-0f6245ba349f req-541da827-a213-4587-ae2f-08446edf7dd3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Updating instance_info_cache with network_info: [{"id": "975b8d2c-d44e-424b-8044-846b81925518", "address": "fa:16:3e:3a:48:6c", "network": {"id": "726f016a-ee65-4a75-be87-3386221dc835", "bridge": "br-int", "label": "tempest-network-smoke--1514542140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap975b8d2c-d4", "ovs_interfaceid": "975b8d2c-d44e-424b-8044-846b81925518", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:30:21 compute-0 nova_compute[186241]: 2025-11-25 06:30:21.793 186245 DEBUG oslo_concurrency.lockutils [req-c88e411b-5e98-4c41-8673-0f6245ba349f req-541da827-a213-4587-ae2f-08446edf7dd3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-38ebec87-f0fc-428a-9751-f97953e7c554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:30:22 compute-0 nova_compute[186241]: 2025-11-25 06:30:22.072 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:23 compute-0 nova_compute[186241]: 2025-11-25 06:30:23.128 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:24 compute-0 nova_compute[186241]: 2025-11-25 06:30:24.392 186245 INFO nova.compute.manager [None req-a4b19fc6-3dde-4e86-9672-a8bd29f51d8a 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Get console output
Nov 25 06:30:24 compute-0 nova_compute[186241]: 2025-11-25 06:30:24.395 211770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 25 06:30:25 compute-0 podman[216954]: 2025-11-25 06:30:25.057708378 +0000 UTC m=+0.036912928 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125)
Nov 25 06:30:25 compute-0 nova_compute[186241]: 2025-11-25 06:30:25.435 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:30:25 compute-0 nova_compute[186241]: 2025-11-25 06:30:25.436 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:30:26 compute-0 nova_compute[186241]: 2025-11-25 06:30:26.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:30:27 compute-0 nova_compute[186241]: 2025-11-25 06:30:27.074 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:27 compute-0 ovn_controller[95135]: 2025-11-25T06:30:27Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3a:48:6c 10.100.0.12
Nov 25 06:30:28 compute-0 nova_compute[186241]: 2025-11-25 06:30:28.130 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:28 compute-0 nova_compute[186241]: 2025-11-25 06:30:28.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:30:29 compute-0 podman[216972]: 2025-11-25 06:30:29.056039303 +0000 UTC m=+0.035812826 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 25 06:30:29 compute-0 nova_compute[186241]: 2025-11-25 06:30:29.444 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:30:29 compute-0 nova_compute[186241]: 2025-11-25 06:30:29.444 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:30:29 compute-0 nova_compute[186241]: 2025-11-25 06:30:29.444 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:30:29 compute-0 nova_compute[186241]: 2025-11-25 06:30:29.444 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.470 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:30:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:30.509 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:30:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:30.510 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.510 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.528 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.529 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.582 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.600 186245 DEBUG nova.compute.manager [req-10f83926-9b13-4953-bc43-64a7311aafbd req-8f658078-c0df-49e5-b9f3-3e53f323cff3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Received event network-changed-975b8d2c-d44e-424b-8044-846b81925518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.600 186245 DEBUG nova.compute.manager [req-10f83926-9b13-4953-bc43-64a7311aafbd req-8f658078-c0df-49e5-b9f3-3e53f323cff3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Refreshing instance network info cache due to event network-changed-975b8d2c-d44e-424b-8044-846b81925518. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.601 186245 DEBUG oslo_concurrency.lockutils [req-10f83926-9b13-4953-bc43-64a7311aafbd req-8f658078-c0df-49e5-b9f3-3e53f323cff3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-38ebec87-f0fc-428a-9751-f97953e7c554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.601 186245 DEBUG oslo_concurrency.lockutils [req-10f83926-9b13-4953-bc43-64a7311aafbd req-8f658078-c0df-49e5-b9f3-3e53f323cff3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-38ebec87-f0fc-428a-9751-f97953e7c554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.601 186245 DEBUG nova.network.neutron [req-10f83926-9b13-4953-bc43-64a7311aafbd req-8f658078-c0df-49e5-b9f3-3e53f323cff3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Refreshing network info cache for port 975b8d2c-d44e-424b-8044-846b81925518 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.767 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.767 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5577MB free_disk=72.98893356323242GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.768 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:30:30 compute-0 nova_compute[186241]: 2025-11-25 06:30:30.768 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.143 186245 DEBUG oslo_concurrency.lockutils [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "38ebec87-f0fc-428a-9751-f97953e7c554" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.143 186245 DEBUG oslo_concurrency.lockutils [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.143 186245 DEBUG oslo_concurrency.lockutils [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.143 186245 DEBUG oslo_concurrency.lockutils [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.144 186245 DEBUG oslo_concurrency.lockutils [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.144 186245 INFO nova.compute.manager [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Terminating instance
Nov 25 06:30:31 compute-0 ovn_controller[95135]: 2025-11-25T06:30:31Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3a:48:6c 10.100.0.12
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.647 186245 DEBUG nova.compute.manager [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:30:31 compute-0 kernel: tap975b8d2c-d4 (unregistering): left promiscuous mode
Nov 25 06:30:31 compute-0 NetworkManager[55345]: <info>  [1764052231.6689] device (tap975b8d2c-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:30:31 compute-0 ovn_controller[95135]: 2025-11-25T06:30:31Z|00148|binding|INFO|Releasing lport 975b8d2c-d44e-424b-8044-846b81925518 from this chassis (sb_readonly=0)
Nov 25 06:30:31 compute-0 ovn_controller[95135]: 2025-11-25T06:30:31Z|00149|binding|INFO|Setting lport 975b8d2c-d44e-424b-8044-846b81925518 down in Southbound
Nov 25 06:30:31 compute-0 ovn_controller[95135]: 2025-11-25T06:30:31Z|00150|binding|INFO|Removing iface tap975b8d2c-d4 ovn-installed in OVS
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.677 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.689 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:48:6c 10.100.0.12'], port_security=['fa:16:3e:3a:48:6c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '38ebec87-f0fc-428a-9751-f97953e7c554', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-726f016a-ee65-4a75-be87-3386221dc835', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f947ef18-6e76-4aa1-80a8-1dfd2828a0b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9dcc51a4-c804-4eab-90b5-720685a9ca99, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=975b8d2c-d44e-424b-8044-846b81925518) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.690 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 975b8d2c-d44e-424b-8044-846b81925518 in datapath 726f016a-ee65-4a75-be87-3386221dc835 unbound from our chassis
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.692 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 726f016a-ee65-4a75-be87-3386221dc835, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.692 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[6954a082-b390-4638-9dc2-b63972cff416]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.693 103953 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-726f016a-ee65-4a75-be87-3386221dc835 namespace which is not needed anymore
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.697 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:31 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 25 06:30:31 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 11.484s CPU time.
Nov 25 06:30:31 compute-0 systemd-machined[152921]: Machine qemu-10-instance-0000000a terminated.
Nov 25 06:30:31 compute-0 neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835[216849]: [NOTICE]   (216853) : haproxy version is 2.8.14-c23fe91
Nov 25 06:30:31 compute-0 neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835[216849]: [NOTICE]   (216853) : path to executable is /usr/sbin/haproxy
Nov 25 06:30:31 compute-0 neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835[216849]: [WARNING]  (216853) : Exiting Master process...
Nov 25 06:30:31 compute-0 podman[217021]: 2025-11-25 06:30:31.770016117 +0000 UTC m=+0.018718965 container kill d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 06:30:31 compute-0 neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835[216849]: [ALERT]    (216853) : Current worker (216855) exited with code 143 (Terminated)
Nov 25 06:30:31 compute-0 neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835[216849]: [WARNING]  (216853) : All workers exited. Exiting... (0)
Nov 25 06:30:31 compute-0 systemd[1]: libpod-d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8.scope: Deactivated successfully.
Nov 25 06:30:31 compute-0 podman[217033]: 2025-11-25 06:30:31.798232076 +0000 UTC m=+0.014583000 container died d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.802 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance 38ebec87-f0fc-428a-9751-f97953e7c554 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.802 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.803 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:30:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8-userdata-shm.mount: Deactivated successfully.
Nov 25 06:30:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-f2b665e75d2f7d9849b4a82faab80c98c26f66f6f496cbb69a04956edd2f1c0c-merged.mount: Deactivated successfully.
Nov 25 06:30:31 compute-0 podman[217033]: 2025-11-25 06:30:31.815897564 +0000 UTC m=+0.032248488 container cleanup d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 06:30:31 compute-0 systemd[1]: libpod-conmon-d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8.scope: Deactivated successfully.
Nov 25 06:30:31 compute-0 podman[217034]: 2025-11-25 06:30:31.823853766 +0000 UTC m=+0.034358656 container remove d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.827 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[56098ee7-81c6-4252-a61c-ad45eb69f4a2]: (4, ("Tue Nov 25 06:30:31 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835 (d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8)\nd688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8\nTue Nov 25 06:30:31 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-726f016a-ee65-4a75-be87-3386221dc835 (d688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8)\nd688d4320e66f1938c72bc16abc16c71ba2606500834d3cb764ac13237c4b0e8\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.828 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[2b29b78c-7238-4289-80cd-88d618e6b4d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.828 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/726f016a-ee65-4a75-be87-3386221dc835.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/726f016a-ee65-4a75-be87-3386221dc835.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.828 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a9fe19-fd94-4548-b620-bfe7522aac43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.829 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726f016a-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:30:31 compute-0 kernel: tap726f016a-e0: left promiscuous mode
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.830 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.841 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.845 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.847 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8e29c821-3b71-40b8-b77b-815dcbdfd1a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.854 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[b81dfe6f-f215-4572-a379-f4598f74246a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.856 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5b67582e-c59b-4716-8a98-72ab26c2637d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.860 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:31 compute-0 systemd[1]: run-netns-ovnmeta\x2d726f016a\x2dee65\x2d4a75\x2dbe87\x2d3386221dc835.mount: Deactivated successfully.
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.868 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[150d6255-e731-4907-a375-33caf68249f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 319592, 'reachable_time': 23730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217062, 'error': None, 'target': 'ovnmeta-726f016a-ee65-4a75-be87-3386221dc835', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.871 104066 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-726f016a-ee65-4a75-be87-3386221dc835 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 06:30:31 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:31.871 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e80439-b2da-463e-962b-cdd69fde695f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.879 186245 INFO nova.virt.libvirt.driver [-] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Instance destroyed successfully.
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.879 186245 DEBUG nova.objects.instance [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid 38ebec87-f0fc-428a-9751-f97953e7c554 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.921 186245 DEBUG nova.compute.manager [req-d918a06e-5f2d-47cb-a55f-a4e2f154794b req-ce470cd2-1ce0-4f42-9fc1-38a3fded3a0f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Received event network-vif-unplugged-975b8d2c-d44e-424b-8044-846b81925518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.921 186245 DEBUG oslo_concurrency.lockutils [req-d918a06e-5f2d-47cb-a55f-a4e2f154794b req-ce470cd2-1ce0-4f42-9fc1-38a3fded3a0f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.922 186245 DEBUG oslo_concurrency.lockutils [req-d918a06e-5f2d-47cb-a55f-a4e2f154794b req-ce470cd2-1ce0-4f42-9fc1-38a3fded3a0f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.922 186245 DEBUG oslo_concurrency.lockutils [req-d918a06e-5f2d-47cb-a55f-a4e2f154794b req-ce470cd2-1ce0-4f42-9fc1-38a3fded3a0f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.922 186245 DEBUG nova.compute.manager [req-d918a06e-5f2d-47cb-a55f-a4e2f154794b req-ce470cd2-1ce0-4f42-9fc1-38a3fded3a0f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] No waiting events found dispatching network-vif-unplugged-975b8d2c-d44e-424b-8044-846b81925518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:30:31 compute-0 nova_compute[186241]: 2025-11-25 06:30:31.922 186245 DEBUG nova.compute.manager [req-d918a06e-5f2d-47cb-a55f-a4e2f154794b req-ce470cd2-1ce0-4f42-9fc1-38a3fded3a0f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Received event network-vif-unplugged-975b8d2c-d44e-424b-8044-846b81925518 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.076 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.344 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.382 186245 DEBUG nova.virt.libvirt.vif [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:29:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1169628092',display_name='tempest-TestNetworkBasicOps-server-1169628092',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1169628092',id=10,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE2FhW+QNubceZJArZWViP3HHnq1MkVj3LYH+Qb2Y8Y1wEj4tHjkcM4k8WY26rNPXpnbw/RMSTjbF6xLvLrT3mCEwwBuqLYmryFvMYCCAVPPEsKuI63nuHc/l/9LebrMsA==',key_name='tempest-TestNetworkBasicOps-454611679',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:30:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-f28024go',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:30:09Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=38ebec87-f0fc-428a-9751-f97953e7c554,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "975b8d2c-d44e-424b-8044-846b81925518", "address": "fa:16:3e:3a:48:6c", "network": {"id": "726f016a-ee65-4a75-be87-3386221dc835", "bridge": "br-int", "label": "tempest-network-smoke--1514542140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap975b8d2c-d4", "ovs_interfaceid": "975b8d2c-d44e-424b-8044-846b81925518", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.382 186245 DEBUG nova.network.os_vif_util [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "975b8d2c-d44e-424b-8044-846b81925518", "address": "fa:16:3e:3a:48:6c", "network": {"id": "726f016a-ee65-4a75-be87-3386221dc835", "bridge": "br-int", "label": "tempest-network-smoke--1514542140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap975b8d2c-d4", "ovs_interfaceid": "975b8d2c-d44e-424b-8044-846b81925518", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.383 186245 DEBUG nova.network.os_vif_util [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3a:48:6c,bridge_name='br-int',has_traffic_filtering=True,id=975b8d2c-d44e-424b-8044-846b81925518,network=Network(726f016a-ee65-4a75-be87-3386221dc835),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap975b8d2c-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.383 186245 DEBUG os_vif [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3a:48:6c,bridge_name='br-int',has_traffic_filtering=True,id=975b8d2c-d44e-424b-8044-846b81925518,network=Network(726f016a-ee65-4a75-be87-3386221dc835),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap975b8d2c-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.384 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.384 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap975b8d2c-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.387 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.387 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.388 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=32c9e002-068d-43b1-9abb-8b0949dd3eb9) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.388 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.389 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.390 186245 INFO os_vif [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3a:48:6c,bridge_name='br-int',has_traffic_filtering=True,id=975b8d2c-d44e-424b-8044-846b81925518,network=Network(726f016a-ee65-4a75-be87-3386221dc835),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap975b8d2c-d4')
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.391 186245 INFO nova.virt.libvirt.driver [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Deleting instance files /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554_del
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.391 186245 INFO nova.virt.libvirt.driver [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Deletion of /var/lib/nova/instances/38ebec87-f0fc-428a-9751-f97953e7c554_del complete
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.848 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.848 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.899 186245 INFO nova.compute.manager [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Took 1.25 seconds to destroy the instance on the hypervisor.
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.899 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.899 186245 DEBUG nova.compute.manager [-] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.899 186245 DEBUG nova.network.neutron [-] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.932 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:30:32 compute-0 nova_compute[186241]: 2025-11-25 06:30:32.932 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11834
Nov 25 06:30:33 compute-0 nova_compute[186241]: 2025-11-25 06:30:33.165 186245 DEBUG nova.network.neutron [req-10f83926-9b13-4953-bc43-64a7311aafbd req-8f658078-c0df-49e5-b9f3-3e53f323cff3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Updated VIF entry in instance network info cache for port 975b8d2c-d44e-424b-8044-846b81925518. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:30:33 compute-0 nova_compute[186241]: 2025-11-25 06:30:33.165 186245 DEBUG nova.network.neutron [req-10f83926-9b13-4953-bc43-64a7311aafbd req-8f658078-c0df-49e5-b9f3-3e53f323cff3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Updating instance_info_cache with network_info: [{"id": "975b8d2c-d44e-424b-8044-846b81925518", "address": "fa:16:3e:3a:48:6c", "network": {"id": "726f016a-ee65-4a75-be87-3386221dc835", "bridge": "br-int", "label": "tempest-network-smoke--1514542140", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap975b8d2c-d4", "ovs_interfaceid": "975b8d2c-d44e-424b-8044-846b81925518", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:30:33 compute-0 nova_compute[186241]: 2025-11-25 06:30:33.436 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11843
Nov 25 06:30:33 compute-0 nova_compute[186241]: 2025-11-25 06:30:33.436 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:30:33 compute-0 nova_compute[186241]: 2025-11-25 06:30:33.667 186245 DEBUG oslo_concurrency.lockutils [req-10f83926-9b13-4953-bc43-64a7311aafbd req-8f658078-c0df-49e5-b9f3-3e53f323cff3 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-38ebec87-f0fc-428a-9751-f97953e7c554" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:30:34 compute-0 nova_compute[186241]: 2025-11-25 06:30:34.119 186245 DEBUG nova.compute.manager [req-296c0781-f63f-4fef-8bdb-24074e353d95 req-7724ca8d-4ed9-4cde-ae16-cf586c7664b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Received event network-vif-plugged-975b8d2c-d44e-424b-8044-846b81925518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:30:34 compute-0 nova_compute[186241]: 2025-11-25 06:30:34.119 186245 DEBUG oslo_concurrency.lockutils [req-296c0781-f63f-4fef-8bdb-24074e353d95 req-7724ca8d-4ed9-4cde-ae16-cf586c7664b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:30:34 compute-0 nova_compute[186241]: 2025-11-25 06:30:34.119 186245 DEBUG oslo_concurrency.lockutils [req-296c0781-f63f-4fef-8bdb-24074e353d95 req-7724ca8d-4ed9-4cde-ae16-cf586c7664b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:30:34 compute-0 nova_compute[186241]: 2025-11-25 06:30:34.119 186245 DEBUG oslo_concurrency.lockutils [req-296c0781-f63f-4fef-8bdb-24074e353d95 req-7724ca8d-4ed9-4cde-ae16-cf586c7664b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:30:34 compute-0 nova_compute[186241]: 2025-11-25 06:30:34.120 186245 DEBUG nova.compute.manager [req-296c0781-f63f-4fef-8bdb-24074e353d95 req-7724ca8d-4ed9-4cde-ae16-cf586c7664b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] No waiting events found dispatching network-vif-plugged-975b8d2c-d44e-424b-8044-846b81925518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:30:34 compute-0 nova_compute[186241]: 2025-11-25 06:30:34.120 186245 WARNING nova.compute.manager [req-296c0781-f63f-4fef-8bdb-24074e353d95 req-7724ca8d-4ed9-4cde-ae16-cf586c7664b5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Received unexpected event network-vif-plugged-975b8d2c-d44e-424b-8044-846b81925518 for instance with vm_state active and task_state deleting.
Nov 25 06:30:35 compute-0 nova_compute[186241]: 2025-11-25 06:30:35.361 186245 DEBUG nova.compute.manager [req-6bfb3ff0-c4ce-4c2b-b84e-894f55861b78 req-a1955869-a18f-4a7c-99f4-ee01ffb0342f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Received event network-vif-deleted-975b8d2c-d44e-424b-8044-846b81925518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:30:35 compute-0 nova_compute[186241]: 2025-11-25 06:30:35.361 186245 INFO nova.compute.manager [req-6bfb3ff0-c4ce-4c2b-b84e-894f55861b78 req-a1955869-a18f-4a7c-99f4-ee01ffb0342f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Neutron deleted interface 975b8d2c-d44e-424b-8044-846b81925518; detaching it from the instance and deleting it from the info cache
Nov 25 06:30:35 compute-0 nova_compute[186241]: 2025-11-25 06:30:35.362 186245 DEBUG nova.network.neutron [req-6bfb3ff0-c4ce-4c2b-b84e-894f55861b78 req-a1955869-a18f-4a7c-99f4-ee01ffb0342f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:30:35 compute-0 nova_compute[186241]: 2025-11-25 06:30:35.703 186245 DEBUG nova.network.neutron [-] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:30:35 compute-0 nova_compute[186241]: 2025-11-25 06:30:35.865 186245 DEBUG nova.compute.manager [req-6bfb3ff0-c4ce-4c2b-b84e-894f55861b78 req-a1955869-a18f-4a7c-99f4-ee01ffb0342f a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Detach interface failed, port_id=975b8d2c-d44e-424b-8044-846b81925518, reason: Instance 38ebec87-f0fc-428a-9751-f97953e7c554 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Nov 25 06:30:36 compute-0 nova_compute[186241]: 2025-11-25 06:30:36.206 186245 INFO nova.compute.manager [-] [instance: 38ebec87-f0fc-428a-9751-f97953e7c554] Took 3.31 seconds to deallocate network for instance.
Nov 25 06:30:36 compute-0 nova_compute[186241]: 2025-11-25 06:30:36.712 186245 DEBUG oslo_concurrency.lockutils [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:30:36 compute-0 nova_compute[186241]: 2025-11-25 06:30:36.712 186245 DEBUG oslo_concurrency.lockutils [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:30:36 compute-0 nova_compute[186241]: 2025-11-25 06:30:36.746 186245 DEBUG nova.compute.provider_tree [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:30:37 compute-0 nova_compute[186241]: 2025-11-25 06:30:37.078 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:37 compute-0 podman[217077]: 2025-11-25 06:30:37.108053229 +0000 UTC m=+0.082970138 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 06:30:37 compute-0 nova_compute[186241]: 2025-11-25 06:30:37.250 186245 DEBUG nova.scheduler.client.report [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:30:37 compute-0 nova_compute[186241]: 2025-11-25 06:30:37.389 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:37 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:37.511 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:30:37 compute-0 nova_compute[186241]: 2025-11-25 06:30:37.756 186245 DEBUG oslo_concurrency.lockutils [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:30:37 compute-0 nova_compute[186241]: 2025-11-25 06:30:37.777 186245 INFO nova.scheduler.client.report [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance 38ebec87-f0fc-428a-9751-f97953e7c554
Nov 25 06:30:38 compute-0 nova_compute[186241]: 2025-11-25 06:30:38.786 186245 DEBUG oslo_concurrency.lockutils [None req-9c2d8217-2677-42ea-af22-8d6315bce224 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "38ebec87-f0fc-428a-9751-f97953e7c554" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:30:41 compute-0 podman[217101]: 2025-11-25 06:30:41.060114374 +0000 UTC m=+0.037953700 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:30:41 compute-0 podman[217100]: 2025-11-25 06:30:41.063213855 +0000 UTC m=+0.042585218 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:30:42 compute-0 nova_compute[186241]: 2025-11-25 06:30:42.079 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:42 compute-0 nova_compute[186241]: 2025-11-25 06:30:42.341 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:42 compute-0 nova_compute[186241]: 2025-11-25 06:30:42.391 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:42 compute-0 nova_compute[186241]: 2025-11-25 06:30:42.417 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:45 compute-0 podman[217139]: 2025-11-25 06:30:45.05585185 +0000 UTC m=+0.034964467 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 06:30:47 compute-0 nova_compute[186241]: 2025-11-25 06:30:47.081 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:47 compute-0 nova_compute[186241]: 2025-11-25 06:30:47.392 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:47.715 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:30:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:47.715 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:30:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:47.716 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:30:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:48.173 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:9d:a8 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-949ef554-1519-45e1-97c2-6c679a7a80e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=203ab58e-73f0-45ec-9572-3acf3a7b4768, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4da4604e-e99e-4437-9f7c-6c43af2847ac) old=Port_Binding(mac=['fa:16:3e:f7:9d:a8'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-949ef554-1519-45e1-97c2-6c679a7a80e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:30:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:48.174 103953 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4da4604e-e99e-4437-9f7c-6c43af2847ac in datapath 949ef554-1519-45e1-97c2-6c679a7a80e3 updated
Nov 25 06:30:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:48.175 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 949ef554-1519-45e1-97c2-6c679a7a80e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:30:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:30:48.175 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[0704f6f8-7069-4043-bd11-cd3646ace7cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:30:50 compute-0 podman[217156]: 2025-11-25 06:30:50.060929156 +0000 UTC m=+0.036290516 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible)
Nov 25 06:30:52 compute-0 nova_compute[186241]: 2025-11-25 06:30:52.082 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:52 compute-0 nova_compute[186241]: 2025-11-25 06:30:52.393 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:56 compute-0 podman[217174]: 2025-11-25 06:30:56.054925042 +0000 UTC m=+0.034389003 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 25 06:30:57 compute-0 nova_compute[186241]: 2025-11-25 06:30:57.083 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:30:57 compute-0 nova_compute[186241]: 2025-11-25 06:30:57.394 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:00 compute-0 podman[217191]: 2025-11-25 06:31:00.053843022 +0000 UTC m=+0.034425721 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:31:02 compute-0 nova_compute[186241]: 2025-11-25 06:31:02.084 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:02 compute-0 nova_compute[186241]: 2025-11-25 06:31:02.395 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:06 compute-0 nova_compute[186241]: 2025-11-25 06:31:06.961 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "423a1897-c822-497b-a9e5-9127b1ec1b38" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:06 compute-0 nova_compute[186241]: 2025-11-25 06:31:06.961 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:07 compute-0 nova_compute[186241]: 2025-11-25 06:31:07.086 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:07 compute-0 nova_compute[186241]: 2025-11-25 06:31:07.396 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:07 compute-0 nova_compute[186241]: 2025-11-25 06:31:07.463 186245 DEBUG nova.compute.manager [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:31:07 compute-0 nova_compute[186241]: 2025-11-25 06:31:07.990 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:07 compute-0 nova_compute[186241]: 2025-11-25 06:31:07.990 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:07 compute-0 nova_compute[186241]: 2025-11-25 06:31:07.995 186245 DEBUG nova.virt.hardware [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:31:07 compute-0 nova_compute[186241]: 2025-11-25 06:31:07.996 186245 INFO nova.compute.claims [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:31:08 compute-0 podman[217211]: 2025-11-25 06:31:08.080936983 +0000 UTC m=+0.057012203 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 25 06:31:09 compute-0 nova_compute[186241]: 2025-11-25 06:31:09.032 186245 DEBUG nova.compute.provider_tree [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:31:09 compute-0 nova_compute[186241]: 2025-11-25 06:31:09.535 186245 DEBUG nova.scheduler.client.report [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:31:10 compute-0 nova_compute[186241]: 2025-11-25 06:31:10.040 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:10 compute-0 nova_compute[186241]: 2025-11-25 06:31:10.041 186245 DEBUG nova.compute.manager [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:31:10 compute-0 nova_compute[186241]: 2025-11-25 06:31:10.546 186245 DEBUG nova.compute.manager [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:31:10 compute-0 nova_compute[186241]: 2025-11-25 06:31:10.547 186245 DEBUG nova.network.neutron [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:31:11 compute-0 nova_compute[186241]: 2025-11-25 06:31:11.051 186245 INFO nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:31:11 compute-0 nova_compute[186241]: 2025-11-25 06:31:11.296 186245 DEBUG nova.policy [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:31:11 compute-0 nova_compute[186241]: 2025-11-25 06:31:11.555 186245 DEBUG nova.compute.manager [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:31:12 compute-0 podman[217236]: 2025-11-25 06:31:12.066701055 +0000 UTC m=+0.039636931 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:31:12 compute-0 podman[217235]: 2025-11-25 06:31:12.070979296 +0000 UTC m=+0.045084816 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.086 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.398 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.507 186245 DEBUG nova.network.neutron [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Successfully created port: af54200a-3890-4538-8af0-4a157900fd41 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.564 186245 DEBUG nova.compute.manager [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.565 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.565 186245 INFO nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Creating image(s)
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.565 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.566 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.566 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.567 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.569 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.570 186245 DEBUG oslo_concurrency.processutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.615 186245 DEBUG oslo_concurrency.processutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.616 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.617 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.618 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.621 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.621 186245 DEBUG oslo_concurrency.processutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.665 186245 DEBUG oslo_concurrency.processutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.666 186245 DEBUG oslo_concurrency.processutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.686 186245 DEBUG oslo_concurrency.processutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk 1073741824" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.687 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.687 186245 DEBUG oslo_concurrency.processutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.732 186245 DEBUG oslo_concurrency.processutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.733 186245 DEBUG nova.virt.disk.api [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.733 186245 DEBUG oslo_concurrency.processutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.777 186245 DEBUG oslo_concurrency.processutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.778 186245 DEBUG nova.virt.disk.api [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.779 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.779 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Ensure instance console log exists: /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.779 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.780 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:12 compute-0 nova_compute[186241]: 2025-11-25 06:31:12.780 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:13 compute-0 nova_compute[186241]: 2025-11-25 06:31:13.490 186245 DEBUG nova.network.neutron [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Successfully updated port: af54200a-3890-4538-8af0-4a157900fd41 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:31:13 compute-0 nova_compute[186241]: 2025-11-25 06:31:13.680 186245 DEBUG nova.compute.manager [req-ee48cc80-04a8-4472-80f1-cc386383e77d req-4f790e72-9a1b-4b33-b2e1-6f0ea87a30c7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-changed-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:31:13 compute-0 nova_compute[186241]: 2025-11-25 06:31:13.680 186245 DEBUG nova.compute.manager [req-ee48cc80-04a8-4472-80f1-cc386383e77d req-4f790e72-9a1b-4b33-b2e1-6f0ea87a30c7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Refreshing instance network info cache due to event network-changed-af54200a-3890-4538-8af0-4a157900fd41. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:31:13 compute-0 nova_compute[186241]: 2025-11-25 06:31:13.680 186245 DEBUG oslo_concurrency.lockutils [req-ee48cc80-04a8-4472-80f1-cc386383e77d req-4f790e72-9a1b-4b33-b2e1-6f0ea87a30c7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:31:13 compute-0 nova_compute[186241]: 2025-11-25 06:31:13.680 186245 DEBUG oslo_concurrency.lockutils [req-ee48cc80-04a8-4472-80f1-cc386383e77d req-4f790e72-9a1b-4b33-b2e1-6f0ea87a30c7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:31:13 compute-0 nova_compute[186241]: 2025-11-25 06:31:13.681 186245 DEBUG nova.network.neutron [req-ee48cc80-04a8-4472-80f1-cc386383e77d req-4f790e72-9a1b-4b33-b2e1-6f0ea87a30c7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Refreshing network info cache for port af54200a-3890-4538-8af0-4a157900fd41 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:31:13 compute-0 nova_compute[186241]: 2025-11-25 06:31:13.995 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:31:15 compute-0 nova_compute[186241]: 2025-11-25 06:31:15.056 186245 DEBUG nova.network.neutron [req-ee48cc80-04a8-4472-80f1-cc386383e77d req-4f790e72-9a1b-4b33-b2e1-6f0ea87a30c7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:31:16 compute-0 podman[217290]: 2025-11-25 06:31:16.0618815 +0000 UTC m=+0.037254371 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:31:16 compute-0 nova_compute[186241]: 2025-11-25 06:31:16.278 186245 DEBUG nova.network.neutron [req-ee48cc80-04a8-4472-80f1-cc386383e77d req-4f790e72-9a1b-4b33-b2e1-6f0ea87a30c7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:31:16 compute-0 nova_compute[186241]: 2025-11-25 06:31:16.781 186245 DEBUG oslo_concurrency.lockutils [req-ee48cc80-04a8-4472-80f1-cc386383e77d req-4f790e72-9a1b-4b33-b2e1-6f0ea87a30c7 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:31:16 compute-0 nova_compute[186241]: 2025-11-25 06:31:16.782 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:31:16 compute-0 nova_compute[186241]: 2025-11-25 06:31:16.782 186245 DEBUG nova.network.neutron [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:31:17 compute-0 nova_compute[186241]: 2025-11-25 06:31:17.088 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:17 compute-0 nova_compute[186241]: 2025-11-25 06:31:17.399 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:17 compute-0 nova_compute[186241]: 2025-11-25 06:31:17.585 186245 DEBUG nova.network.neutron [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:31:19 compute-0 nova_compute[186241]: 2025-11-25 06:31:19.603 186245 DEBUG nova.network.neutron [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Updating instance_info_cache with network_info: [{"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.106 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.106 186245 DEBUG nova.compute.manager [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Instance network_info: |[{"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.108 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Start _get_guest_xml network_info=[{"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.111 186245 WARNING nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.112 186245 DEBUG nova.virt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1463597864', uuid='423a1897-c822-497b-a9e5-9127b1ec1b38'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764052280.1122427) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.118 186245 DEBUG nova.virt.libvirt.host [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.118 186245 DEBUG nova.virt.libvirt.host [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.121 186245 DEBUG nova.virt.libvirt.host [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.121 186245 DEBUG nova.virt.libvirt.host [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.121 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.121 186245 DEBUG nova.virt.hardware [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.122 186245 DEBUG nova.virt.hardware [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.122 186245 DEBUG nova.virt.hardware [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.122 186245 DEBUG nova.virt.hardware [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.122 186245 DEBUG nova.virt.hardware [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.122 186245 DEBUG nova.virt.hardware [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.122 186245 DEBUG nova.virt.hardware [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.122 186245 DEBUG nova.virt.hardware [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.123 186245 DEBUG nova.virt.hardware [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.123 186245 DEBUG nova.virt.hardware [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.123 186245 DEBUG nova.virt.hardware [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.125 186245 DEBUG nova.virt.libvirt.vif [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:31:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1463597864',display_name='tempest-TestNetworkBasicOps-server-1463597864',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1463597864',id=11,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB9YZGtnzOrlYGmISpA+unwY4gONLlAjFqumwC23Q7+Crw5i+LqLcVlVt/0sCZ6d5eQBauJ/jx+crkSOk9SVpdrxjSZtY8aMCmxwkmfsK+tTsRzoWni06YjvZ9ACgoxHlw==',key_name='tempest-TestNetworkBasicOps-1642248967',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-4krqvcv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:31:11Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=423a1897-c822-497b-a9e5-9127b1ec1b38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.126 186245 DEBUG nova.network.os_vif_util [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.126 186245 DEBUG nova.network.os_vif_util [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=af54200a-3890-4538-8af0-4a157900fd41,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf54200a-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.127 186245 DEBUG nova.objects.instance [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid 423a1897-c822-497b-a9e5-9127b1ec1b38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:31:20 compute-0 ovn_controller[95135]: 2025-11-25T06:31:20Z|00151|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.630 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:31:20 compute-0 nova_compute[186241]:   <uuid>423a1897-c822-497b-a9e5-9127b1ec1b38</uuid>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   <name>instance-0000000b</name>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-1463597864</nova:name>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:31:20</nova:creationTime>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:31:20 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:31:20 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:31:20 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:31:20 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:31:20 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:31:20 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:31:20 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:31:20 compute-0 nova_compute[186241]:         <nova:port uuid="af54200a-3890-4538-8af0-4a157900fd41">
Nov 25 06:31:20 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <system>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <entry name="serial">423a1897-c822-497b-a9e5-9127b1ec1b38</entry>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <entry name="uuid">423a1897-c822-497b-a9e5-9127b1ec1b38</entry>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     </system>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   <os>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   </os>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   <features>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   </features>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk.config"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:15:b2:36"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <target dev="tapaf54200a-38"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/console.log" append="off"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <video>
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     </video>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:31:20 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:31:20 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:31:20 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:31:20 compute-0 nova_compute[186241]: </domain>
Nov 25 06:31:20 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.630 186245 DEBUG nova.compute.manager [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Preparing to wait for external event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.630 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.630 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.630 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.631 186245 DEBUG nova.virt.libvirt.vif [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:31:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1463597864',display_name='tempest-TestNetworkBasicOps-server-1463597864',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1463597864',id=11,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB9YZGtnzOrlYGmISpA+unwY4gONLlAjFqumwC23Q7+Crw5i+LqLcVlVt/0sCZ6d5eQBauJ/jx+crkSOk9SVpdrxjSZtY8aMCmxwkmfsK+tTsRzoWni06YjvZ9ACgoxHlw==',key_name='tempest-TestNetworkBasicOps-1642248967',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-4krqvcv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:31:11Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=423a1897-c822-497b-a9e5-9127b1ec1b38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.631 186245 DEBUG nova.network.os_vif_util [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.631 186245 DEBUG nova.network.os_vif_util [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=af54200a-3890-4538-8af0-4a157900fd41,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf54200a-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.632 186245 DEBUG os_vif [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=af54200a-3890-4538-8af0-4a157900fd41,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf54200a-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.632 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.632 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.632 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.633 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.633 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '7173e9e0-eb45-55b1-af7e-f2c382caa710', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.634 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.636 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.638 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.638 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf54200a-38, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.638 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapaf54200a-38, col_values=(('qos', UUID('97cf69fd-301b-4d81-8cd4-3cfa528ee806')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.639 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapaf54200a-38, col_values=(('external_ids', {'iface-id': 'af54200a-3890-4538-8af0-4a157900fd41', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:b2:36', 'vm-uuid': '423a1897-c822-497b-a9e5-9127b1ec1b38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:20 compute-0 NetworkManager[55345]: <info>  [1764052280.6405] manager: (tapaf54200a-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.639 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.641 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.643 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:20 compute-0 nova_compute[186241]: 2025-11-25 06:31:20.643 186245 INFO os_vif [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=af54200a-3890-4538-8af0-4a157900fd41,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf54200a-38')
Nov 25 06:31:20 compute-0 podman[217310]: 2025-11-25 06:31:20.704946254 +0000 UTC m=+0.038675890 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container)
Nov 25 06:31:22 compute-0 nova_compute[186241]: 2025-11-25 06:31:22.089 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:22 compute-0 nova_compute[186241]: 2025-11-25 06:31:22.168 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:31:22 compute-0 nova_compute[186241]: 2025-11-25 06:31:22.169 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:31:22 compute-0 nova_compute[186241]: 2025-11-25 06:31:22.169 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:15:b2:36, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:31:22 compute-0 nova_compute[186241]: 2025-11-25 06:31:22.169 186245 INFO nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Using config drive
Nov 25 06:31:23 compute-0 nova_compute[186241]: 2025-11-25 06:31:23.570 186245 INFO nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Creating config drive at /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk.config
Nov 25 06:31:23 compute-0 nova_compute[186241]: 2025-11-25 06:31:23.575 186245 DEBUG oslo_concurrency.processutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmp7q12pqr2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:23 compute-0 nova_compute[186241]: 2025-11-25 06:31:23.692 186245 DEBUG oslo_concurrency.processutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmp7q12pqr2" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:23 compute-0 kernel: tapaf54200a-38: entered promiscuous mode
Nov 25 06:31:23 compute-0 ovn_controller[95135]: 2025-11-25T06:31:23Z|00152|binding|INFO|Claiming lport af54200a-3890-4538-8af0-4a157900fd41 for this chassis.
Nov 25 06:31:23 compute-0 ovn_controller[95135]: 2025-11-25T06:31:23Z|00153|binding|INFO|af54200a-3890-4538-8af0-4a157900fd41: Claiming fa:16:3e:15:b2:36 10.100.0.4
Nov 25 06:31:23 compute-0 nova_compute[186241]: 2025-11-25 06:31:23.729 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:23 compute-0 nova_compute[186241]: 2025-11-25 06:31:23.733 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:23 compute-0 NetworkManager[55345]: <info>  [1764052283.7347] manager: (tapaf54200a-38): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.738 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:b2:36 10.100.0.4'], port_security=['fa:16:3e:15:b2:36 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '423a1897-c822-497b-a9e5-9127b1ec1b38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-949ef554-1519-45e1-97c2-6c679a7a80e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '459b41c6-f1b9-460f-94da-cfc72f07e425', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=203ab58e-73f0-45ec-9572-3acf3a7b4768, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=af54200a-3890-4538-8af0-4a157900fd41) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.739 103953 INFO neutron.agent.ovn.metadata.agent [-] Port af54200a-3890-4538-8af0-4a157900fd41 in datapath 949ef554-1519-45e1-97c2-6c679a7a80e3 bound to our chassis
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.740 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 949ef554-1519-45e1-97c2-6c679a7a80e3
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.748 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[9e114a68-660d-4f73-9feb-345f35862e9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.749 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap949ef554-11 in ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.750 211354 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap949ef554-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.750 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[efd2956f-0848-417a-8c75-72590ed5695c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.751 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[9db881db-152d-421a-a864-312942e0ea62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.758 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[b9916c78-c9ee-4590-ac34-48965af95f75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 systemd-udevd[217345]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:31:23 compute-0 NetworkManager[55345]: <info>  [1764052283.7681] device (tapaf54200a-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:31:23 compute-0 NetworkManager[55345]: <info>  [1764052283.7689] device (tapaf54200a-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:31:23 compute-0 systemd-machined[152921]: New machine qemu-11-instance-0000000b.
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.789 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[026ab8b9-62aa-4a99-b823-c744bc24b731]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 nova_compute[186241]: 2025-11-25 06:31:23.790 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:23 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Nov 25 06:31:23 compute-0 ovn_controller[95135]: 2025-11-25T06:31:23Z|00154|binding|INFO|Setting lport af54200a-3890-4538-8af0-4a157900fd41 ovn-installed in OVS
Nov 25 06:31:23 compute-0 ovn_controller[95135]: 2025-11-25T06:31:23Z|00155|binding|INFO|Setting lport af54200a-3890-4538-8af0-4a157900fd41 up in Southbound
Nov 25 06:31:23 compute-0 nova_compute[186241]: 2025-11-25 06:31:23.796 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.813 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[619dca5f-1222-4b4e-8765-ded9b693410a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 NetworkManager[55345]: <info>  [1764052283.8166] manager: (tap949ef554-10): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.816 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[b8e2e601-c915-478d-a73a-a341adc03e1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.843 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[d2eb407b-85cd-45d9-94b6-fce739424911]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.845 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[7b38e0d7-eb6d-446c-b90c-1368e47fc735]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 NetworkManager[55345]: <info>  [1764052283.8638] device (tap949ef554-10): carrier: link connected
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.867 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[4d01b484-5e1b-4034-85b0-f46a3f808226]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.881 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[f7cbe66a-3e5a-4872-9587-2d4da0be33c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap949ef554-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:9d:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327226, 'reachable_time': 40216, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217371, 'error': None, 'target': 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.892 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[216f3712-37dd-4f48-91c3-064acbd250a0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:9da8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327226, 'tstamp': 327226}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217372, 'error': None, 'target': 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.905 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[36e023be-b0bf-4270-8921-5713e2dc6d2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap949ef554-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:9d:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327226, 'reachable_time': 40216, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217373, 'error': None, 'target': 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.927 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[6be42356-4626-44fe-b565-a1a45c4ce53d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.968 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[32ecc524-ef01-422c-994b-707f912d1f56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.969 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap949ef554-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.969 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.969 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap949ef554-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:23 compute-0 nova_compute[186241]: 2025-11-25 06:31:23.971 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:23 compute-0 NetworkManager[55345]: <info>  [1764052283.9714] manager: (tap949ef554-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Nov 25 06:31:23 compute-0 kernel: tap949ef554-10: entered promiscuous mode
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.976 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap949ef554-10, col_values=(('external_ids', {'iface-id': '4da4604e-e99e-4437-9f7c-6c43af2847ac'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:23 compute-0 nova_compute[186241]: 2025-11-25 06:31:23.977 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:23 compute-0 nova_compute[186241]: 2025-11-25 06:31:23.978 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:23 compute-0 ovn_controller[95135]: 2025-11-25T06:31:23Z|00156|binding|INFO|Releasing lport 4da4604e-e99e-4437-9f7c-6c43af2847ac from this chassis (sb_readonly=0)
Nov 25 06:31:23 compute-0 nova_compute[186241]: 2025-11-25 06:31:23.990 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.991 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8089727e-fa26-4ea7-8aab-8e3fe90e6246]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.992 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.992 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.992 103953 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 949ef554-1519-45e1-97c2-6c679a7a80e3 disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.992 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.993 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[d643704c-33b6-49a7-979b-e91a40e45b25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.993 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.993 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[213c63e3-db02-4360-a22c-bd49c2c4aead]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.994 103953 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: global
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     log         /dev/log local0 debug
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     log-tag     haproxy-metadata-proxy-949ef554-1519-45e1-97c2-6c679a7a80e3
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     user        root
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     group       root
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     maxconn     1024
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     pidfile     /var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     daemon
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: defaults
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     log global
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     mode http
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     option httplog
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     option dontlognull
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     option http-server-close
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     option forwardfor
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     retries                 3
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     timeout http-request    30s
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     timeout connect         30s
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     timeout client          32s
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     timeout server          32s
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     timeout http-keep-alive 30s
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: listen listener
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     bind 169.254.169.254:80
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:     http-request add-header X-OVN-Network-ID 949ef554-1519-45e1-97c2-6c679a7a80e3
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 06:31:23 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:23.994 103953 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'env', 'PROCESS_TAG=haproxy-949ef554-1519-45e1-97c2-6c679a7a80e3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/949ef554-1519-45e1-97c2-6c679a7a80e3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Nov 25 06:31:24 compute-0 nova_compute[186241]: 2025-11-25 06:31:24.029 186245 DEBUG nova.compute.manager [req-950b61da-59af-4c7a-a603-8ab40a623999 req-c146944d-1727-4c5c-8142-8966bd866fa6 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:31:24 compute-0 nova_compute[186241]: 2025-11-25 06:31:24.029 186245 DEBUG oslo_concurrency.lockutils [req-950b61da-59af-4c7a-a603-8ab40a623999 req-c146944d-1727-4c5c-8142-8966bd866fa6 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:24 compute-0 nova_compute[186241]: 2025-11-25 06:31:24.030 186245 DEBUG oslo_concurrency.lockutils [req-950b61da-59af-4c7a-a603-8ab40a623999 req-c146944d-1727-4c5c-8142-8966bd866fa6 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:24 compute-0 nova_compute[186241]: 2025-11-25 06:31:24.030 186245 DEBUG oslo_concurrency.lockutils [req-950b61da-59af-4c7a-a603-8ab40a623999 req-c146944d-1727-4c5c-8142-8966bd866fa6 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:24 compute-0 nova_compute[186241]: 2025-11-25 06:31:24.030 186245 DEBUG nova.compute.manager [req-950b61da-59af-4c7a-a603-8ab40a623999 req-c146944d-1727-4c5c-8142-8966bd866fa6 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Processing event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:31:24 compute-0 podman[217402]: 2025-11-25 06:31:24.275909501 +0000 UTC m=+0.036107740 container create 8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 06:31:24 compute-0 systemd[1]: Started libpod-conmon-8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09.scope.
Nov 25 06:31:24 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:31:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b2718eaaed4fd368860f42b56c4a0b13ba83498a81b1c01205bede26071e013/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:31:24 compute-0 podman[217402]: 2025-11-25 06:31:24.342308211 +0000 UTC m=+0.102506460 container init 8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 25 06:31:24 compute-0 podman[217402]: 2025-11-25 06:31:24.347312792 +0000 UTC m=+0.107511021 container start 8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 06:31:24 compute-0 podman[217402]: 2025-11-25 06:31:24.261197008 +0000 UTC m=+0.021395257 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:31:24 compute-0 neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3[217414]: [NOTICE]   (217418) : New worker (217420) forked
Nov 25 06:31:24 compute-0 neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3[217414]: [NOTICE]   (217418) : Loading success.
Nov 25 06:31:24 compute-0 nova_compute[186241]: 2025-11-25 06:31:24.522 186245 DEBUG nova.compute.manager [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:31:24 compute-0 nova_compute[186241]: 2025-11-25 06:31:24.526 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:31:24 compute-0 nova_compute[186241]: 2025-11-25 06:31:24.528 186245 INFO nova.virt.libvirt.driver [-] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Instance spawned successfully.
Nov 25 06:31:24 compute-0 nova_compute[186241]: 2025-11-25 06:31:24.529 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:31:25 compute-0 nova_compute[186241]: 2025-11-25 06:31:25.037 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:31:25 compute-0 nova_compute[186241]: 2025-11-25 06:31:25.038 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:31:25 compute-0 nova_compute[186241]: 2025-11-25 06:31:25.038 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:31:25 compute-0 nova_compute[186241]: 2025-11-25 06:31:25.039 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:31:25 compute-0 nova_compute[186241]: 2025-11-25 06:31:25.039 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:31:25 compute-0 nova_compute[186241]: 2025-11-25 06:31:25.039 186245 DEBUG nova.virt.libvirt.driver [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:31:25 compute-0 nova_compute[186241]: 2025-11-25 06:31:25.545 186245 INFO nova.compute.manager [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Took 12.98 seconds to spawn the instance on the hypervisor.
Nov 25 06:31:25 compute-0 nova_compute[186241]: 2025-11-25 06:31:25.545 186245 DEBUG nova.compute.manager [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:31:25 compute-0 nova_compute[186241]: 2025-11-25 06:31:25.640 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:25 compute-0 nova_compute[186241]: 2025-11-25 06:31:25.938 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:31:26 compute-0 nova_compute[186241]: 2025-11-25 06:31:26.059 186245 INFO nova.compute.manager [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Took 18.09 seconds to build instance.
Nov 25 06:31:26 compute-0 nova_compute[186241]: 2025-11-25 06:31:26.302 186245 DEBUG nova.compute.manager [req-a603eb7e-042a-4bd9-a7d4-46a8194897d7 req-9bdbc1f1-add0-40f5-bfe0-85f182de8579 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:31:26 compute-0 nova_compute[186241]: 2025-11-25 06:31:26.303 186245 DEBUG oslo_concurrency.lockutils [req-a603eb7e-042a-4bd9-a7d4-46a8194897d7 req-9bdbc1f1-add0-40f5-bfe0-85f182de8579 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:26 compute-0 nova_compute[186241]: 2025-11-25 06:31:26.303 186245 DEBUG oslo_concurrency.lockutils [req-a603eb7e-042a-4bd9-a7d4-46a8194897d7 req-9bdbc1f1-add0-40f5-bfe0-85f182de8579 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:26 compute-0 nova_compute[186241]: 2025-11-25 06:31:26.303 186245 DEBUG oslo_concurrency.lockutils [req-a603eb7e-042a-4bd9-a7d4-46a8194897d7 req-9bdbc1f1-add0-40f5-bfe0-85f182de8579 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:26 compute-0 nova_compute[186241]: 2025-11-25 06:31:26.303 186245 DEBUG nova.compute.manager [req-a603eb7e-042a-4bd9-a7d4-46a8194897d7 req-9bdbc1f1-add0-40f5-bfe0-85f182de8579 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] No waiting events found dispatching network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:31:26 compute-0 nova_compute[186241]: 2025-11-25 06:31:26.303 186245 WARNING nova.compute.manager [req-a603eb7e-042a-4bd9-a7d4-46a8194897d7 req-9bdbc1f1-add0-40f5-bfe0-85f182de8579 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received unexpected event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 for instance with vm_state active and task_state None.
Nov 25 06:31:26 compute-0 nova_compute[186241]: 2025-11-25 06:31:26.562 186245 DEBUG oslo_concurrency.lockutils [None req-daf47430-413d-4c49-89bb-b4088c71570b 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:26 compute-0 nova_compute[186241]: 2025-11-25 06:31:26.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:31:26 compute-0 nova_compute[186241]: 2025-11-25 06:31:26.933 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:31:27 compute-0 podman[217432]: 2025-11-25 06:31:27.066753031 +0000 UTC m=+0.042841289 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 06:31:27 compute-0 nova_compute[186241]: 2025-11-25 06:31:27.091 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:28 compute-0 nova_compute[186241]: 2025-11-25 06:31:28.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:31:29 compute-0 nova_compute[186241]: 2025-11-25 06:31:29.440 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:29 compute-0 nova_compute[186241]: 2025-11-25 06:31:29.440 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:29 compute-0 nova_compute[186241]: 2025-11-25 06:31:29.441 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:29 compute-0 nova_compute[186241]: 2025-11-25 06:31:29.441 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.466 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.479 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:30 compute-0 NetworkManager[55345]: <info>  [1764052290.4887] manager: (patch-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Nov 25 06:31:30 compute-0 NetworkManager[55345]: <info>  [1764052290.4892] manager: (patch-br-int-to-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Nov 25 06:31:30 compute-0 ovn_controller[95135]: 2025-11-25T06:31:30Z|00157|binding|INFO|Releasing lport 4da4604e-e99e-4437-9f7c-6c43af2847ac from this chassis (sb_readonly=0)
Nov 25 06:31:30 compute-0 ovn_controller[95135]: 2025-11-25T06:31:30Z|00158|binding|INFO|Releasing lport 4da4604e-e99e-4437-9f7c-6c43af2847ac from this chassis (sb_readonly=0)
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.512 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.516 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.517 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.518 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.572 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.642 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.644 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:30.644 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:31:30 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:30.645 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.776 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.777 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5645MB free_disk=73.01697158813477GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.777 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.778 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.855 186245 DEBUG nova.compute.manager [req-10d9ac86-64a7-483a-b0fa-893dd5103a9f req-279c1f2e-e965-4e22-bf54-aec409700f55 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-changed-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.855 186245 DEBUG nova.compute.manager [req-10d9ac86-64a7-483a-b0fa-893dd5103a9f req-279c1f2e-e965-4e22-bf54-aec409700f55 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Refreshing instance network info cache due to event network-changed-af54200a-3890-4538-8af0-4a157900fd41. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.855 186245 DEBUG oslo_concurrency.lockutils [req-10d9ac86-64a7-483a-b0fa-893dd5103a9f req-279c1f2e-e965-4e22-bf54-aec409700f55 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.855 186245 DEBUG oslo_concurrency.lockutils [req-10d9ac86-64a7-483a-b0fa-893dd5103a9f req-279c1f2e-e965-4e22-bf54-aec409700f55 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:31:30 compute-0 nova_compute[186241]: 2025-11-25 06:31:30.855 186245 DEBUG nova.network.neutron [req-10d9ac86-64a7-483a-b0fa-893dd5103a9f req-279c1f2e-e965-4e22-bf54-aec409700f55 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Refreshing network info cache for port af54200a-3890-4538-8af0-4a157900fd41 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:31:31 compute-0 podman[217459]: 2025-11-25 06:31:31.060986814 +0000 UTC m=+0.040451647 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:31:31 compute-0 nova_compute[186241]: 2025-11-25 06:31:31.971 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance 423a1897-c822-497b-a9e5-9127b1ec1b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:31:31 compute-0 nova_compute[186241]: 2025-11-25 06:31:31.971 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:31:31 compute-0 nova_compute[186241]: 2025-11-25 06:31:31.972 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:31:32 compute-0 nova_compute[186241]: 2025-11-25 06:31:32.093 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:32 compute-0 nova_compute[186241]: 2025-11-25 06:31:32.316 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:31:32 compute-0 nova_compute[186241]: 2025-11-25 06:31:32.820 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:31:33 compute-0 nova_compute[186241]: 2025-11-25 06:31:33.327 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:31:33 compute-0 nova_compute[186241]: 2025-11-25 06:31:33.327 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.550s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:34 compute-0 nova_compute[186241]: 2025-11-25 06:31:34.323 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:31:34 compute-0 nova_compute[186241]: 2025-11-25 06:31:34.324 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:31:34 compute-0 nova_compute[186241]: 2025-11-25 06:31:34.828 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:31:34 compute-0 nova_compute[186241]: 2025-11-25 06:31:34.829 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:31:34 compute-0 nova_compute[186241]: 2025-11-25 06:31:34.829 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:31:34 compute-0 nova_compute[186241]: 2025-11-25 06:31:34.829 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:31:35 compute-0 ovn_controller[95135]: 2025-11-25T06:31:35Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:b2:36 10.100.0.4
Nov 25 06:31:35 compute-0 ovn_controller[95135]: 2025-11-25T06:31:35Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:b2:36 10.100.0.4
Nov 25 06:31:35 compute-0 nova_compute[186241]: 2025-11-25 06:31:35.644 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:36 compute-0 nova_compute[186241]: 2025-11-25 06:31:36.604 186245 DEBUG nova.network.neutron [req-10d9ac86-64a7-483a-b0fa-893dd5103a9f req-279c1f2e-e965-4e22-bf54-aec409700f55 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Updated VIF entry in instance network info cache for port af54200a-3890-4538-8af0-4a157900fd41. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:31:36 compute-0 nova_compute[186241]: 2025-11-25 06:31:36.605 186245 DEBUG nova.network.neutron [req-10d9ac86-64a7-483a-b0fa-893dd5103a9f req-279c1f2e-e965-4e22-bf54-aec409700f55 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Updating instance_info_cache with network_info: [{"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:31:37 compute-0 nova_compute[186241]: 2025-11-25 06:31:37.094 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:37 compute-0 nova_compute[186241]: 2025-11-25 06:31:37.108 186245 DEBUG oslo_concurrency.lockutils [req-10d9ac86-64a7-483a-b0fa-893dd5103a9f req-279c1f2e-e965-4e22-bf54-aec409700f55 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:31:39 compute-0 podman[217486]: 2025-11-25 06:31:39.077894649 +0000 UTC m=+0.057670493 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Nov 25 06:31:39 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:39.647 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:40 compute-0 nova_compute[186241]: 2025-11-25 06:31:40.646 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:42 compute-0 nova_compute[186241]: 2025-11-25 06:31:42.095 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:43 compute-0 podman[217510]: 2025-11-25 06:31:43.064977494 +0000 UTC m=+0.041462861 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:31:43 compute-0 podman[217509]: 2025-11-25 06:31:43.065320201 +0000 UTC m=+0.044088751 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 25 06:31:43 compute-0 nova_compute[186241]: 2025-11-25 06:31:43.784 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "afacab74-90bc-4c94-9989-57a24bca630d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:43 compute-0 nova_compute[186241]: 2025-11-25 06:31:43.784 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:44 compute-0 nova_compute[186241]: 2025-11-25 06:31:44.287 186245 DEBUG nova.compute.manager [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:31:44 compute-0 nova_compute[186241]: 2025-11-25 06:31:44.814 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:44 compute-0 nova_compute[186241]: 2025-11-25 06:31:44.815 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:44 compute-0 nova_compute[186241]: 2025-11-25 06:31:44.820 186245 DEBUG nova.virt.hardware [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:31:44 compute-0 nova_compute[186241]: 2025-11-25 06:31:44.820 186245 INFO nova.compute.claims [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:31:45 compute-0 nova_compute[186241]: 2025-11-25 06:31:45.647 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:45 compute-0 nova_compute[186241]: 2025-11-25 06:31:45.881 186245 DEBUG nova.compute.provider_tree [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:31:46 compute-0 nova_compute[186241]: 2025-11-25 06:31:46.385 186245 DEBUG nova.scheduler.client.report [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:31:46 compute-0 nova_compute[186241]: 2025-11-25 06:31:46.890 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:46 compute-0 nova_compute[186241]: 2025-11-25 06:31:46.890 186245 DEBUG nova.compute.manager [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:31:47 compute-0 podman[217547]: 2025-11-25 06:31:47.054964593 +0000 UTC m=+0.035108558 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:31:47 compute-0 nova_compute[186241]: 2025-11-25 06:31:47.096 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:47 compute-0 nova_compute[186241]: 2025-11-25 06:31:47.396 186245 DEBUG nova.compute.manager [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:31:47 compute-0 nova_compute[186241]: 2025-11-25 06:31:47.397 186245 DEBUG nova.network.neutron [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:31:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:47.751 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:47.752 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:31:47.752 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:47 compute-0 nova_compute[186241]: 2025-11-25 06:31:47.900 186245 INFO nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:31:48 compute-0 nova_compute[186241]: 2025-11-25 06:31:48.125 186245 DEBUG nova.policy [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:31:48 compute-0 nova_compute[186241]: 2025-11-25 06:31:48.404 186245 DEBUG nova.compute.manager [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.414 186245 DEBUG nova.compute.manager [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.415 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.415 186245 INFO nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Creating image(s)
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.416 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.416 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.416 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.417 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.420 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.421 186245 DEBUG oslo_concurrency.processutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.446 186245 DEBUG nova.network.neutron [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Successfully created port: 4b069377-07a8-4b40-a297-6a1002cb49d0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.464 186245 DEBUG oslo_concurrency.processutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.464 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.465 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.465 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.468 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.469 186245 DEBUG oslo_concurrency.processutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.510 186245 DEBUG oslo_concurrency.processutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.511 186245 DEBUG oslo_concurrency.processutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.527 186245 DEBUG oslo_concurrency.processutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk 1073741824" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.528 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.528 186245 DEBUG oslo_concurrency.processutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.571 186245 DEBUG oslo_concurrency.processutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.572 186245 DEBUG nova.virt.disk.api [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.572 186245 DEBUG oslo_concurrency.processutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.626 186245 DEBUG oslo_concurrency.processutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.627 186245 DEBUG nova.virt.disk.api [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.627 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.628 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Ensure instance console log exists: /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.628 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.628 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:49 compute-0 nova_compute[186241]: 2025-11-25 06:31:49.628 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:50 compute-0 nova_compute[186241]: 2025-11-25 06:31:50.329 186245 DEBUG nova.network.neutron [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Successfully updated port: 4b069377-07a8-4b40-a297-6a1002cb49d0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:31:50 compute-0 nova_compute[186241]: 2025-11-25 06:31:50.484 186245 DEBUG nova.compute.manager [req-bad95ff8-783b-4520-ac04-03377d0aa106 req-714c3b8a-25eb-4c79-b0f0-318b9712f9aa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Received event network-changed-4b069377-07a8-4b40-a297-6a1002cb49d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:31:50 compute-0 nova_compute[186241]: 2025-11-25 06:31:50.485 186245 DEBUG nova.compute.manager [req-bad95ff8-783b-4520-ac04-03377d0aa106 req-714c3b8a-25eb-4c79-b0f0-318b9712f9aa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Refreshing instance network info cache due to event network-changed-4b069377-07a8-4b40-a297-6a1002cb49d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:31:50 compute-0 nova_compute[186241]: 2025-11-25 06:31:50.485 186245 DEBUG oslo_concurrency.lockutils [req-bad95ff8-783b-4520-ac04-03377d0aa106 req-714c3b8a-25eb-4c79-b0f0-318b9712f9aa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-afacab74-90bc-4c94-9989-57a24bca630d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:31:50 compute-0 nova_compute[186241]: 2025-11-25 06:31:50.485 186245 DEBUG oslo_concurrency.lockutils [req-bad95ff8-783b-4520-ac04-03377d0aa106 req-714c3b8a-25eb-4c79-b0f0-318b9712f9aa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-afacab74-90bc-4c94-9989-57a24bca630d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:31:50 compute-0 nova_compute[186241]: 2025-11-25 06:31:50.485 186245 DEBUG nova.network.neutron [req-bad95ff8-783b-4520-ac04-03377d0aa106 req-714c3b8a-25eb-4c79-b0f0-318b9712f9aa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Refreshing network info cache for port 4b069377-07a8-4b40-a297-6a1002cb49d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:31:50 compute-0 nova_compute[186241]: 2025-11-25 06:31:50.648 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:50 compute-0 nova_compute[186241]: 2025-11-25 06:31:50.833 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-afacab74-90bc-4c94-9989-57a24bca630d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:31:51 compute-0 podman[217580]: 2025-11-25 06:31:51.059926211 +0000 UTC m=+0.039976781 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 06:31:51 compute-0 nova_compute[186241]: 2025-11-25 06:31:51.310 186245 DEBUG nova.network.neutron [req-bad95ff8-783b-4520-ac04-03377d0aa106 req-714c3b8a-25eb-4c79-b0f0-318b9712f9aa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:31:51 compute-0 nova_compute[186241]: 2025-11-25 06:31:51.810 186245 DEBUG nova.network.neutron [req-bad95ff8-783b-4520-ac04-03377d0aa106 req-714c3b8a-25eb-4c79-b0f0-318b9712f9aa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:31:52 compute-0 nova_compute[186241]: 2025-11-25 06:31:52.097 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:52 compute-0 nova_compute[186241]: 2025-11-25 06:31:52.314 186245 DEBUG oslo_concurrency.lockutils [req-bad95ff8-783b-4520-ac04-03377d0aa106 req-714c3b8a-25eb-4c79-b0f0-318b9712f9aa a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-afacab74-90bc-4c94-9989-57a24bca630d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:31:52 compute-0 nova_compute[186241]: 2025-11-25 06:31:52.315 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-afacab74-90bc-4c94-9989-57a24bca630d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:31:52 compute-0 nova_compute[186241]: 2025-11-25 06:31:52.315 186245 DEBUG nova.network.neutron [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:31:53 compute-0 nova_compute[186241]: 2025-11-25 06:31:53.311 186245 DEBUG nova.network.neutron [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:31:55 compute-0 nova_compute[186241]: 2025-11-25 06:31:55.649 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.099 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.313 186245 DEBUG nova.network.neutron [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Updating instance_info_cache with network_info: [{"id": "4b069377-07a8-4b40-a297-6a1002cb49d0", "address": "fa:16:3e:7d:aa:1e", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b069377-07", "ovs_interfaceid": "4b069377-07a8-4b40-a297-6a1002cb49d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.816 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-afacab74-90bc-4c94-9989-57a24bca630d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.817 186245 DEBUG nova.compute.manager [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Instance network_info: |[{"id": "4b069377-07a8-4b40-a297-6a1002cb49d0", "address": "fa:16:3e:7d:aa:1e", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b069377-07", "ovs_interfaceid": "4b069377-07a8-4b40-a297-6a1002cb49d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.819 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Start _get_guest_xml network_info=[{"id": "4b069377-07a8-4b40-a297-6a1002cb49d0", "address": "fa:16:3e:7d:aa:1e", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b069377-07", "ovs_interfaceid": "4b069377-07a8-4b40-a297-6a1002cb49d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.821 186245 WARNING nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.822 186245 DEBUG nova.virt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-646815711', uuid='afacab74-90bc-4c94-9989-57a24bca630d'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "4b069377-07a8-4b40-a297-6a1002cb49d0", "address": "fa:16:3e:7d:aa:1e", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b069377-07", "ovs_interfaceid": "4b069377-07a8-4b40-a297-6a1002cb49d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764052317.822489) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.833 186245 DEBUG nova.virt.libvirt.host [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.833 186245 DEBUG nova.virt.libvirt.host [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.835 186245 DEBUG nova.virt.libvirt.host [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.836 186245 DEBUG nova.virt.libvirt.host [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.836 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.836 186245 DEBUG nova.virt.hardware [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.836 186245 DEBUG nova.virt.hardware [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.837 186245 DEBUG nova.virt.hardware [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.837 186245 DEBUG nova.virt.hardware [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.837 186245 DEBUG nova.virt.hardware [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.837 186245 DEBUG nova.virt.hardware [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.837 186245 DEBUG nova.virt.hardware [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.837 186245 DEBUG nova.virt.hardware [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.838 186245 DEBUG nova.virt.hardware [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.838 186245 DEBUG nova.virt.hardware [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.838 186245 DEBUG nova.virt.hardware [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.840 186245 DEBUG nova.virt.libvirt.vif [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:31:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-646815711',display_name='tempest-TestNetworkBasicOps-server-646815711',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-646815711',id=12,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIw+NJqacZin6vyCp8LqtKU0xN0GX9tRvX6CN4wtS5FXFTz8F9RxRd3usS1v0JTZDi+00cXkfPnzlD8TexQGqGFEfiIfSkTFaJPyRAGOO4rKmljfz0Is5i/68e0r318Nsg==',key_name='tempest-TestNetworkBasicOps-1173617577',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-on805j2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:31:48Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=afacab74-90bc-4c94-9989-57a24bca630d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b069377-07a8-4b40-a297-6a1002cb49d0", "address": "fa:16:3e:7d:aa:1e", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b069377-07", "ovs_interfaceid": "4b069377-07a8-4b40-a297-6a1002cb49d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.841 186245 DEBUG nova.network.os_vif_util [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "4b069377-07a8-4b40-a297-6a1002cb49d0", "address": "fa:16:3e:7d:aa:1e", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b069377-07", "ovs_interfaceid": "4b069377-07a8-4b40-a297-6a1002cb49d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.841 186245 DEBUG nova.network.os_vif_util [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:aa:1e,bridge_name='br-int',has_traffic_filtering=True,id=4b069377-07a8-4b40-a297-6a1002cb49d0,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b069377-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:31:57 compute-0 nova_compute[186241]: 2025-11-25 06:31:57.842 186245 DEBUG nova.objects.instance [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid afacab74-90bc-4c94-9989-57a24bca630d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:31:58 compute-0 podman[217598]: 2025-11-25 06:31:58.065116755 +0000 UTC m=+0.043005470 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.346 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:31:58 compute-0 nova_compute[186241]:   <uuid>afacab74-90bc-4c94-9989-57a24bca630d</uuid>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   <name>instance-0000000c</name>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-646815711</nova:name>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:31:57</nova:creationTime>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:31:58 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:31:58 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:31:58 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:31:58 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:31:58 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:31:58 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:31:58 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:31:58 compute-0 nova_compute[186241]:         <nova:port uuid="4b069377-07a8-4b40-a297-6a1002cb49d0">
Nov 25 06:31:58 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <system>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <entry name="serial">afacab74-90bc-4c94-9989-57a24bca630d</entry>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <entry name="uuid">afacab74-90bc-4c94-9989-57a24bca630d</entry>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     </system>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   <os>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   </os>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   <features>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   </features>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk.config"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:7d:aa:1e"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <target dev="tap4b069377-07"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/console.log" append="off"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <video>
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     </video>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:31:58 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:31:58 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:31:58 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:31:58 compute-0 nova_compute[186241]: </domain>
Nov 25 06:31:58 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.347 186245 DEBUG nova.compute.manager [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Preparing to wait for external event network-vif-plugged-4b069377-07a8-4b40-a297-6a1002cb49d0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.347 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "afacab74-90bc-4c94-9989-57a24bca630d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.347 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.347 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.348 186245 DEBUG nova.virt.libvirt.vif [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:31:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-646815711',display_name='tempest-TestNetworkBasicOps-server-646815711',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-646815711',id=12,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIw+NJqacZin6vyCp8LqtKU0xN0GX9tRvX6CN4wtS5FXFTz8F9RxRd3usS1v0JTZDi+00cXkfPnzlD8TexQGqGFEfiIfSkTFaJPyRAGOO4rKmljfz0Is5i/68e0r318Nsg==',key_name='tempest-TestNetworkBasicOps-1173617577',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-on805j2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:31:48Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=afacab74-90bc-4c94-9989-57a24bca630d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b069377-07a8-4b40-a297-6a1002cb49d0", "address": "fa:16:3e:7d:aa:1e", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b069377-07", "ovs_interfaceid": "4b069377-07a8-4b40-a297-6a1002cb49d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.348 186245 DEBUG nova.network.os_vif_util [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "4b069377-07a8-4b40-a297-6a1002cb49d0", "address": "fa:16:3e:7d:aa:1e", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b069377-07", "ovs_interfaceid": "4b069377-07a8-4b40-a297-6a1002cb49d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.349 186245 DEBUG nova.network.os_vif_util [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:aa:1e,bridge_name='br-int',has_traffic_filtering=True,id=4b069377-07a8-4b40-a297-6a1002cb49d0,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b069377-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.349 186245 DEBUG os_vif [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:aa:1e,bridge_name='br-int',has_traffic_filtering=True,id=4b069377-07a8-4b40-a297-6a1002cb49d0,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b069377-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.349 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.350 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.350 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.350 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.351 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd7b7f725-57ee-5bc0-9048-1ac2e165c28f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.351 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.353 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.355 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.355 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b069377-07, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.356 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4b069377-07, col_values=(('qos', UUID('1920e5c6-7ade-424c-a211-5261889c4a45')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.356 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4b069377-07, col_values=(('external_ids', {'iface-id': '4b069377-07a8-4b40-a297-6a1002cb49d0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:aa:1e', 'vm-uuid': 'afacab74-90bc-4c94-9989-57a24bca630d'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.357 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:58 compute-0 NetworkManager[55345]: <info>  [1764052318.3577] manager: (tap4b069377-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.359 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.362 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:31:58 compute-0 nova_compute[186241]: 2025-11-25 06:31:58.362 186245 INFO os_vif [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:aa:1e,bridge_name='br-int',has_traffic_filtering=True,id=4b069377-07a8-4b40-a297-6a1002cb49d0,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b069377-07')
Nov 25 06:31:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:31:59.551 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:31:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:31:59.554 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/afacab74-90bc-4c94-9989-57a24bca630d -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}e471cc3fc7ae9ac5d8fd794e8aefa20e5f5c77c3e3edccb41964d2d46a7818d3" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Nov 25 06:31:59 compute-0 nova_compute[186241]: 2025-11-25 06:31:59.886 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:31:59 compute-0 nova_compute[186241]: 2025-11-25 06:31:59.886 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:31:59 compute-0 nova_compute[186241]: 2025-11-25 06:31:59.887 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:7d:aa:1e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:31:59 compute-0 nova_compute[186241]: 2025-11-25 06:31:59.887 186245 INFO nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Using config drive
Nov 25 06:32:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:00.631 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1680 Content-Type: application/json Date: Tue, 25 Nov 2025 06:31:59 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-4f55d058-0383-45db-8f7a-88888b91ac15 x-openstack-request-id: req-4f55d058-0383-45db-8f7a-88888b91ac15 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Nov 25 06:32:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:00.632 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "afacab74-90bc-4c94-9989-57a24bca630d", "name": "tempest-TestNetworkBasicOps-server-646815711", "status": "BUILD", "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "user_id": "66a05d0ca82146a5a458244c8e5364de", "metadata": {}, "hostId": "d6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5", "image": {"id": "5215c26e-be2f-40b4-ac47-476bfa3cf3f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/5215c26e-be2f-40b4-ac47-476bfa3cf3f2"}]}, "flavor": {"id": "53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac"}]}, "created": "2025-11-25T06:31:42Z", "updated": "2025-11-25T06:31:49Z", "addresses": {}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/afacab74-90bc-4c94-9989-57a24bca630d"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/afacab74-90bc-4c94-9989-57a24bca630d"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "", "key_name": "tempest-TestNetworkBasicOps-1173617577", "OS-SRV-USG:launched_at": null, "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-secgroup-smoke-208906488"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-0000000c", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": "spawning", "OS-EXT-STS:vm_state": "building", "OS-EXT-STS:power_state": 0, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Nov 25 06:32:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:00.632 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/afacab74-90bc-4c94-9989-57a24bca630d used request id req-4f55d058-0383-45db-8f7a-88888b91ac15 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Nov 25 06:32:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:00.632 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'afacab74-90bc-4c94-9989-57a24bca630d', 'name': 'tempest-TestNetworkBasicOps-server-646815711', 'flavor': {'id': '53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'd90b557db9104ecfb816b1cdab8712bd', 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'hostId': 'd6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Nov 25 06:32:00 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:00.634 16 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/423a1897-c822-497b-a9e5-9127b1ec1b38 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}e471cc3fc7ae9ac5d8fd794e8aefa20e5f5c77c3e3edccb41964d2d46a7818d3" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:580
Nov 25 06:32:00 compute-0 nova_compute[186241]: 2025-11-25 06:32:00.986 186245 INFO nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Creating config drive at /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk.config
Nov 25 06:32:00 compute-0 nova_compute[186241]: 2025-11-25 06:32:00.991 186245 DEBUG oslo_concurrency.processutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmprgxuynq9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:32:01 compute-0 nova_compute[186241]: 2025-11-25 06:32:01.108 186245 DEBUG oslo_concurrency.processutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmprgxuynq9" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:32:01 compute-0 kernel: tap4b069377-07: entered promiscuous mode
Nov 25 06:32:01 compute-0 NetworkManager[55345]: <info>  [1764052321.1539] manager: (tap4b069377-07): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Nov 25 06:32:01 compute-0 ovn_controller[95135]: 2025-11-25T06:32:01Z|00159|binding|INFO|Claiming lport 4b069377-07a8-4b40-a297-6a1002cb49d0 for this chassis.
Nov 25 06:32:01 compute-0 ovn_controller[95135]: 2025-11-25T06:32:01Z|00160|binding|INFO|4b069377-07a8-4b40-a297-6a1002cb49d0: Claiming fa:16:3e:7d:aa:1e 10.100.0.14
Nov 25 06:32:01 compute-0 nova_compute[186241]: 2025-11-25 06:32:01.157 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.162 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:aa:1e 10.100.0.14'], port_security=['fa:16:3e:7d:aa:1e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'afacab74-90bc-4c94-9989-57a24bca630d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-949ef554-1519-45e1-97c2-6c679a7a80e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f4be2d4a-1533-46ed-87bc-ac265b9010e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=203ab58e-73f0-45ec-9572-3acf3a7b4768, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=4b069377-07a8-4b40-a297-6a1002cb49d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.163 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 4b069377-07a8-4b40-a297-6a1002cb49d0 in datapath 949ef554-1519-45e1-97c2-6c679a7a80e3 bound to our chassis
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.164 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 949ef554-1519-45e1-97c2-6c679a7a80e3
Nov 25 06:32:01 compute-0 ovn_controller[95135]: 2025-11-25T06:32:01Z|00161|binding|INFO|Setting lport 4b069377-07a8-4b40-a297-6a1002cb49d0 ovn-installed in OVS
Nov 25 06:32:01 compute-0 ovn_controller[95135]: 2025-11-25T06:32:01Z|00162|binding|INFO|Setting lport 4b069377-07a8-4b40-a297-6a1002cb49d0 up in Southbound
Nov 25 06:32:01 compute-0 nova_compute[186241]: 2025-11-25 06:32:01.176 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.182 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[a42ae3cb-6635-4f91-a9f2-d9b52aa1b931]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:01 compute-0 systemd-udevd[217643]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:32:01 compute-0 NetworkManager[55345]: <info>  [1764052321.1978] device (tap4b069377-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:32:01 compute-0 NetworkManager[55345]: <info>  [1764052321.1986] device (tap4b069377-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:32:01 compute-0 systemd-machined[152921]: New machine qemu-12-instance-0000000c.
Nov 25 06:32:01 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.206 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[8b417d60-78ca-4df4-bea9-193d913726c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.209 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[f7e7cc8d-751d-43e7-a292-88e2b92d1f5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.230 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[6298fa38-9256-42e6-bc48-ec3d231f0d0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:01 compute-0 podman[217627]: 2025-11-25 06:32:01.244003195 +0000 UTC m=+0.091706860 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.246 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5f50d4-16af-4211-949d-815e05dfb0ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap949ef554-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:9d:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327226, 'reachable_time': 40216, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217665, 'error': None, 'target': 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.260 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[63a1efe3-fb1e-4c8a-82bc-fa7e297c1815]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap949ef554-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327234, 'tstamp': 327234}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217670, 'error': None, 'target': 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap949ef554-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327236, 'tstamp': 327236}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217670, 'error': None, 'target': 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.261 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap949ef554-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:32:01 compute-0 nova_compute[186241]: 2025-11-25 06:32:01.262 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.263 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap949ef554-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.264 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.264 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap949ef554-10, col_values=(('external_ids', {'iface-id': '4da4604e-e99e-4437-9f7c-6c43af2847ac'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.264 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:32:01 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:01.265 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[de94b564-c516-4e46-acb7-c496f71efd80]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-949ef554-1519-45e1-97c2-6c679a7a80e3\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 949ef554-1519-45e1-97c2-6c679a7a80e3\n') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:01.417 16 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1976 Content-Type: application/json Date: Tue, 25 Nov 2025 06:32:00 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-fc4c855a-a0fc-478e-b36d-0b10e764ca52 x-openstack-request-id: req-fc4c855a-a0fc-478e-b36d-0b10e764ca52 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:621
Nov 25 06:32:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:01.418 16 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "423a1897-c822-497b-a9e5-9127b1ec1b38", "name": "tempest-TestNetworkBasicOps-server-1463597864", "status": "ACTIVE", "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "user_id": "66a05d0ca82146a5a458244c8e5364de", "metadata": {}, "hostId": "d6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5", "image": {"id": "5215c26e-be2f-40b4-ac47-476bfa3cf3f2", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/5215c26e-be2f-40b4-ac47-476bfa3cf3f2"}]}, "flavor": {"id": "53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac"}]}, "created": "2025-11-25T06:31:05Z", "updated": "2025-11-25T06:31:25Z", "addresses": {"tempest-network-smoke--2111457677": [{"version": 4, "addr": "10.100.0.4", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:15:b2:36"}, {"version": 4, "addr": "192.168.122.211", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:15:b2:36"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/423a1897-c822-497b-a9e5-9127b1ec1b38"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/423a1897-c822-497b-a9e5-9127b1ec1b38"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-1642248967", "OS-SRV-USG:launched_at": "2025-11-25T06:31:25.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-secgroup-smoke-1379323936"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-0000000b", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:656
Nov 25 06:32:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:01.418 16 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/423a1897-c822-497b-a9e5-9127b1ec1b38 used request id req-fc4c855a-a0fc-478e-b36d-0b10e764ca52 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:1081
Nov 25 06:32:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:01.419 16 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '423a1897-c822-497b-a9e5-9127b1ec1b38', 'name': 'tempest-TestNetworkBasicOps-server-1463597864', 'flavor': {'id': '53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd90b557db9104ecfb816b1cdab8712bd', 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'hostId': 'd6415e4488baf9498ba266263ffa8171c87827cb743dadd0ee29aff5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:226
Nov 25 06:32:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:01.419 16 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Nov 25 06:32:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:01.420 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800ca460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:01.420 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800ca460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:01.420 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: power.state heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:01 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:01.420 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2025-11-25T06:32:01.420226) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:01 compute-0 nova_compute[186241]: 2025-11-25 06:32:01.552 186245 DEBUG nova.compute.manager [req-941239b6-ffbf-4bdb-9f78-b65b843597ac req-6ab2032a-f89d-48e5-af64-d25b579c568c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Received event network-vif-plugged-4b069377-07a8-4b40-a297-6a1002cb49d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:01 compute-0 nova_compute[186241]: 2025-11-25 06:32:01.552 186245 DEBUG oslo_concurrency.lockutils [req-941239b6-ffbf-4bdb-9f78-b65b843597ac req-6ab2032a-f89d-48e5-af64-d25b579c568c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "afacab74-90bc-4c94-9989-57a24bca630d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:01 compute-0 nova_compute[186241]: 2025-11-25 06:32:01.553 186245 DEBUG oslo_concurrency.lockutils [req-941239b6-ffbf-4bdb-9f78-b65b843597ac req-6ab2032a-f89d-48e5-af64-d25b579c568c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:01 compute-0 nova_compute[186241]: 2025-11-25 06:32:01.554 186245 DEBUG oslo_concurrency.lockutils [req-941239b6-ffbf-4bdb-9f78-b65b843597ac req-6ab2032a-f89d-48e5-af64-d25b579c568c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:01 compute-0 nova_compute[186241]: 2025-11-25 06:32:01.554 186245 DEBUG nova.compute.manager [req-941239b6-ffbf-4bdb-9f78-b65b843597ac req-6ab2032a-f89d-48e5-af64-d25b579c568c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Processing event network-vif-plugged-4b069377-07a8-4b40-a297-6a1002cb49d0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:32:02 compute-0 nova_compute[186241]: 2025-11-25 06:32:02.101 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:02 compute-0 nova_compute[186241]: 2025-11-25 06:32:02.178 186245 DEBUG nova.compute.manager [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:32:02 compute-0 nova_compute[186241]: 2025-11-25 06:32:02.183 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:32:02 compute-0 nova_compute[186241]: 2025-11-25 06:32:02.186 186245 INFO nova.virt.libvirt.driver [-] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Instance spawned successfully.
Nov 25 06:32:02 compute-0 nova_compute[186241]: 2025-11-25 06:32:02.186 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.194 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.206 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/power.state volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.206 16 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.207 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.207 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.207 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c42b0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.207 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c42b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.207 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.208 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2025-11-25T06:32:02.207923) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.209 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for afacab74-90bc-4c94-9989-57a24bca630d / tap4b069377-07 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.209 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.211 16 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 423a1897-c822-497b-a9e5-9127b1ec1b38 / tapaf54200a-38 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.211 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.212 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.212 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.212 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.212 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4af0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.212 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4af0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.212 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.213 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.213 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2025-11-25T06:32:02.212913) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.213 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.213 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.214 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.214 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.214 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b23a0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.214 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b23a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.214 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.214 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2025-11-25T06:32:02.214779) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.232 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.232 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.249 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.read.requests volume: 1075 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.250 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.read.requests volume: 113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.250 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.250 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.250 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.251 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2310>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.251 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2310>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.251 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.251 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2025-11-25T06:32:02.251474) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.251 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.252 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.252 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.write.latency volume: 387751894 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.252 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.253 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.253 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.253 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.253 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4490>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.253 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4490>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.254 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.254 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2025-11-25T06:32:02.253964) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.254 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.254 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.255 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.255 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.255 16 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.255 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2a60>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.255 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2a60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.255 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.256 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2025-11-25T06:32:02.255858) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.256 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.256 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.256 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.257 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2520>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.257 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2520>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.257 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.257 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2025-11-25T06:32:02.257348) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.257 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.258 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.258 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.read.bytes volume: 29919744 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.258 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.read.bytes volume: 284990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.259 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.259 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.259 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.259 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4910>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.259 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4910>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.259 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.260 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2025-11-25T06:32:02.259895) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.260 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.260 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.260 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.261 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.261 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.261 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4100>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.261 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4100>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.261 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.261 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2025-11-25T06:32:02.261784) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.262 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.262 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.262 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.263 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.263 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.263 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4c70>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.263 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4c70>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.263 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.263 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2025-11-25T06:32:02.263796) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.264 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.264 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.264 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.264 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.265 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.265 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4d30>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.265 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4d30>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.265 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.265 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2025-11-25T06:32:02.265673) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.266 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.266 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.266 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.266 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.267 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.267 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c44f0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.267 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c44f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.267 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.267 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2025-11-25T06:32:02.267566) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.267 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.268 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/network.incoming.bytes volume: 1436 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.268 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.268 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.268 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.269 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4760>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.269 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4760>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.269 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.269 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2025-11-25T06:32:02.269439) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.269 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.269 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-646815711>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1463597864>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-646815711>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1463597864>]
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.270 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.270 16 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.270 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4310>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.270 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4310>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.271 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: memory.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.271 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2025-11-25T06:32:02.270976) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.271 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.271 16 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance afacab74-90bc-4c94-9989-57a24bca630d: ceilometer.compute.pollsters.NoVolumeException
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.271 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/memory.usage volume: 42.36328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.272 16 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.272 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.272 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.272 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b25e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.272 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b25e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.272 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.273 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2025-11-25T06:32:02.272866) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.273 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.273 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.273 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.read.latency volume: 192488534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.274 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.read.latency volume: 25174138 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.274 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.274 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.274 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.275 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2d00>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.275 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2d00>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.275 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.275 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2025-11-25T06:32:02.275377) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.275 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.276 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.276 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.write.bytes volume: 72966144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.276 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.277 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.277 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.277 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.277 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4070>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.277 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4070>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.277 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.278 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2025-11-25T06:32:02.277898) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.278 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.278 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.278 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.279 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.279 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.279 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b28e0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.279 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b28e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.279 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.279 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2025-11-25T06:32:02.279775) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.286 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.287 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.293 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.293 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.capacity volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.294 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.294 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.294 16 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.294 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800afdc0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.295 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800afdc0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.295 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: cpu heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.295 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2025-11-25T06:32:02.295232) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.295 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.295 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/cpu volume: 9760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.296 16 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.296 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.296 16 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.296 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4bb0>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.296 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4bb0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.297 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.297 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2025-11-25T06:32:02.297115) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.297 16 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.297 16 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-646815711>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1463597864>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-646815711>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1463597864>]
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.297 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.298 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.298 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2460>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.298 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.298 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.298 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2025-11-25T06:32:02.298564) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.298 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.299 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.299 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.299 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.usage volume: 497664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.300 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.300 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.300 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.300 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2f40>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.300 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2f40>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.301 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.301 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2025-11-25T06:32:02.301042) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.301 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.301 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.301 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.write.requests volume: 325 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.302 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.302 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.302 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.303 16 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.303 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800c4700>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.303 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800c4700>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.303 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.303 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets (2025-11-25T06:32:02.303566) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.303 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.304 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.304 16 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.304 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.304 16 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.305 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2970>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.305 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2970>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.305 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.305 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2025-11-25T06:32:02.305472) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.305 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.306 16 DEBUG ceilometer.compute.pollsters [-] afacab74-90bc-4c94-9989-57a24bca630d/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.306 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.306 16 DEBUG ceilometer.compute.pollsters [-] 423a1897-c822-497b-a9e5-9127b1ec1b38/disk.device.allocation volume: 499712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.307 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.307 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.307 16 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.307 16 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7ff3800b2b20>] with coordination group name [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:248
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.307 16 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7ff3800b2b20>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:270
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.307 16 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.root.size heartbeat /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:531
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.308 14 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2025-11-25T06:32:02.307939) _update_status /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:417
Nov 25 06:32:02 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:32:02.308 16 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Nov 25 06:32:02 compute-0 nova_compute[186241]: 2025-11-25 06:32:02.695 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:32:02 compute-0 nova_compute[186241]: 2025-11-25 06:32:02.695 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:32:02 compute-0 nova_compute[186241]: 2025-11-25 06:32:02.695 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:32:02 compute-0 nova_compute[186241]: 2025-11-25 06:32:02.695 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:32:02 compute-0 nova_compute[186241]: 2025-11-25 06:32:02.696 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:32:02 compute-0 nova_compute[186241]: 2025-11-25 06:32:02.696 186245 DEBUG nova.virt.libvirt.driver [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:32:03 compute-0 nova_compute[186241]: 2025-11-25 06:32:03.207 186245 INFO nova.compute.manager [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Took 13.79 seconds to spawn the instance on the hypervisor.
Nov 25 06:32:03 compute-0 nova_compute[186241]: 2025-11-25 06:32:03.208 186245 DEBUG nova.compute.manager [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:32:03 compute-0 nova_compute[186241]: 2025-11-25 06:32:03.357 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:03 compute-0 nova_compute[186241]: 2025-11-25 06:32:03.719 186245 INFO nova.compute.manager [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Took 18.93 seconds to build instance.
Nov 25 06:32:03 compute-0 nova_compute[186241]: 2025-11-25 06:32:03.725 186245 DEBUG nova.compute.manager [req-4fdfbf68-6117-48af-83f3-ce055227d0c2 req-5e29cd20-2d58-4108-91b7-9e052ea32557 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Received event network-vif-plugged-4b069377-07a8-4b40-a297-6a1002cb49d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:03 compute-0 nova_compute[186241]: 2025-11-25 06:32:03.725 186245 DEBUG oslo_concurrency.lockutils [req-4fdfbf68-6117-48af-83f3-ce055227d0c2 req-5e29cd20-2d58-4108-91b7-9e052ea32557 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "afacab74-90bc-4c94-9989-57a24bca630d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:03 compute-0 nova_compute[186241]: 2025-11-25 06:32:03.725 186245 DEBUG oslo_concurrency.lockutils [req-4fdfbf68-6117-48af-83f3-ce055227d0c2 req-5e29cd20-2d58-4108-91b7-9e052ea32557 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:03 compute-0 nova_compute[186241]: 2025-11-25 06:32:03.725 186245 DEBUG oslo_concurrency.lockutils [req-4fdfbf68-6117-48af-83f3-ce055227d0c2 req-5e29cd20-2d58-4108-91b7-9e052ea32557 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:03 compute-0 nova_compute[186241]: 2025-11-25 06:32:03.726 186245 DEBUG nova.compute.manager [req-4fdfbf68-6117-48af-83f3-ce055227d0c2 req-5e29cd20-2d58-4108-91b7-9e052ea32557 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] No waiting events found dispatching network-vif-plugged-4b069377-07a8-4b40-a297-6a1002cb49d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:32:03 compute-0 nova_compute[186241]: 2025-11-25 06:32:03.726 186245 WARNING nova.compute.manager [req-4fdfbf68-6117-48af-83f3-ce055227d0c2 req-5e29cd20-2d58-4108-91b7-9e052ea32557 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Received unexpected event network-vif-plugged-4b069377-07a8-4b40-a297-6a1002cb49d0 for instance with vm_state active and task_state None.
Nov 25 06:32:04 compute-0 nova_compute[186241]: 2025-11-25 06:32:04.222 186245 DEBUG oslo_concurrency.lockutils [None req-0b02d5cc-d2ec-4954-a23c-e734c5a08611 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:07 compute-0 nova_compute[186241]: 2025-11-25 06:32:07.102 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:08 compute-0 nova_compute[186241]: 2025-11-25 06:32:08.359 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:09 compute-0 nova_compute[186241]: 2025-11-25 06:32:09.161 186245 DEBUG nova.compute.manager [req-44bba48d-855b-43a0-8782-d50acb5959c7 req-313cf84b-bd2c-4ba1-88a1-8b43828f9731 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Received event network-changed-4b069377-07a8-4b40-a297-6a1002cb49d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:09 compute-0 nova_compute[186241]: 2025-11-25 06:32:09.161 186245 DEBUG nova.compute.manager [req-44bba48d-855b-43a0-8782-d50acb5959c7 req-313cf84b-bd2c-4ba1-88a1-8b43828f9731 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Refreshing instance network info cache due to event network-changed-4b069377-07a8-4b40-a297-6a1002cb49d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:32:09 compute-0 nova_compute[186241]: 2025-11-25 06:32:09.161 186245 DEBUG oslo_concurrency.lockutils [req-44bba48d-855b-43a0-8782-d50acb5959c7 req-313cf84b-bd2c-4ba1-88a1-8b43828f9731 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-afacab74-90bc-4c94-9989-57a24bca630d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:32:09 compute-0 nova_compute[186241]: 2025-11-25 06:32:09.162 186245 DEBUG oslo_concurrency.lockutils [req-44bba48d-855b-43a0-8782-d50acb5959c7 req-313cf84b-bd2c-4ba1-88a1-8b43828f9731 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-afacab74-90bc-4c94-9989-57a24bca630d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:32:09 compute-0 nova_compute[186241]: 2025-11-25 06:32:09.162 186245 DEBUG nova.network.neutron [req-44bba48d-855b-43a0-8782-d50acb5959c7 req-313cf84b-bd2c-4ba1-88a1-8b43828f9731 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Refreshing network info cache for port 4b069377-07a8-4b40-a297-6a1002cb49d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:32:10 compute-0 podman[217682]: 2025-11-25 06:32:10.085983067 +0000 UTC m=+0.062650789 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3)
Nov 25 06:32:12 compute-0 nova_compute[186241]: 2025-11-25 06:32:12.103 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:12 compute-0 ovn_controller[95135]: 2025-11-25T06:32:12Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7d:aa:1e 10.100.0.14
Nov 25 06:32:12 compute-0 ovn_controller[95135]: 2025-11-25T06:32:12Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7d:aa:1e 10.100.0.14
Nov 25 06:32:13 compute-0 nova_compute[186241]: 2025-11-25 06:32:13.362 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:14 compute-0 podman[217718]: 2025-11-25 06:32:14.069909866 +0000 UTC m=+0.042101154 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, container_name=multipathd)
Nov 25 06:32:14 compute-0 podman[217719]: 2025-11-25 06:32:14.073962222 +0000 UTC m=+0.043491786 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 06:32:14 compute-0 nova_compute[186241]: 2025-11-25 06:32:14.319 186245 DEBUG nova.network.neutron [req-44bba48d-855b-43a0-8782-d50acb5959c7 req-313cf84b-bd2c-4ba1-88a1-8b43828f9731 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Updated VIF entry in instance network info cache for port 4b069377-07a8-4b40-a297-6a1002cb49d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:32:14 compute-0 nova_compute[186241]: 2025-11-25 06:32:14.320 186245 DEBUG nova.network.neutron [req-44bba48d-855b-43a0-8782-d50acb5959c7 req-313cf84b-bd2c-4ba1-88a1-8b43828f9731 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Updating instance_info_cache with network_info: [{"id": "4b069377-07a8-4b40-a297-6a1002cb49d0", "address": "fa:16:3e:7d:aa:1e", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b069377-07", "ovs_interfaceid": "4b069377-07a8-4b40-a297-6a1002cb49d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:32:14 compute-0 nova_compute[186241]: 2025-11-25 06:32:14.823 186245 DEBUG oslo_concurrency.lockutils [req-44bba48d-855b-43a0-8782-d50acb5959c7 req-313cf84b-bd2c-4ba1-88a1-8b43828f9731 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-afacab74-90bc-4c94-9989-57a24bca630d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:32:17 compute-0 nova_compute[186241]: 2025-11-25 06:32:17.106 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:18 compute-0 podman[217757]: 2025-11-25 06:32:18.0547839 +0000 UTC m=+0.033697068 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 25 06:32:18 compute-0 nova_compute[186241]: 2025-11-25 06:32:18.364 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:22 compute-0 podman[217773]: 2025-11-25 06:32:22.065086181 +0000 UTC m=+0.044804668 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Nov 25 06:32:22 compute-0 nova_compute[186241]: 2025-11-25 06:32:22.107 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:23 compute-0 nova_compute[186241]: 2025-11-25 06:32:23.365 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:25 compute-0 nova_compute[186241]: 2025-11-25 06:32:25.261 186245 INFO nova.compute.manager [None req-b48cb1e9-0692-4bcf-a81f-c080a5d4df5c 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Get console output
Nov 25 06:32:25 compute-0 nova_compute[186241]: 2025-11-25 06:32:25.265 211770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.693 186245 DEBUG nova.compute.manager [req-8907fcab-84e8-4385-89d7-49214d958c3d req-5c3b1898-bb64-4827-afc5-9251dedad6b1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-changed-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.694 186245 DEBUG nova.compute.manager [req-8907fcab-84e8-4385-89d7-49214d958c3d req-5c3b1898-bb64-4827-afc5-9251dedad6b1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Refreshing instance network info cache due to event network-changed-af54200a-3890-4538-8af0-4a157900fd41. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.694 186245 DEBUG oslo_concurrency.lockutils [req-8907fcab-84e8-4385-89d7-49214d958c3d req-5c3b1898-bb64-4827-afc5-9251dedad6b1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.694 186245 DEBUG oslo_concurrency.lockutils [req-8907fcab-84e8-4385-89d7-49214d958c3d req-5c3b1898-bb64-4827-afc5-9251dedad6b1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.694 186245 DEBUG nova.network.neutron [req-8907fcab-84e8-4385-89d7-49214d958c3d req-5c3b1898-bb64-4827-afc5-9251dedad6b1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Refreshing network info cache for port af54200a-3890-4538-8af0-4a157900fd41 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.703 186245 DEBUG nova.compute.manager [req-7f36add1-3e8d-4d36-8e92-32085eb9bba9 req-e1874fcf-f8ef-4857-a3c5-23f61e9db382 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-vif-unplugged-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.703 186245 DEBUG oslo_concurrency.lockutils [req-7f36add1-3e8d-4d36-8e92-32085eb9bba9 req-e1874fcf-f8ef-4857-a3c5-23f61e9db382 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.703 186245 DEBUG oslo_concurrency.lockutils [req-7f36add1-3e8d-4d36-8e92-32085eb9bba9 req-e1874fcf-f8ef-4857-a3c5-23f61e9db382 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.703 186245 DEBUG oslo_concurrency.lockutils [req-7f36add1-3e8d-4d36-8e92-32085eb9bba9 req-e1874fcf-f8ef-4857-a3c5-23f61e9db382 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.703 186245 DEBUG nova.compute.manager [req-7f36add1-3e8d-4d36-8e92-32085eb9bba9 req-e1874fcf-f8ef-4857-a3c5-23f61e9db382 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] No waiting events found dispatching network-vif-unplugged-af54200a-3890-4538-8af0-4a157900fd41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.704 186245 WARNING nova.compute.manager [req-7f36add1-3e8d-4d36-8e92-32085eb9bba9 req-e1874fcf-f8ef-4857-a3c5-23f61e9db382 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received unexpected event network-vif-unplugged-af54200a-3890-4538-8af0-4a157900fd41 for instance with vm_state active and task_state None.
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:32:26 compute-0 nova_compute[186241]: 2025-11-25 06:32:26.933 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:32:27 compute-0 nova_compute[186241]: 2025-11-25 06:32:27.108 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:27 compute-0 nova_compute[186241]: 2025-11-25 06:32:27.727 186245 INFO nova.compute.manager [None req-f9574e6f-48d1-4068-ab5f-7d074d6c3dc3 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Get console output
Nov 25 06:32:27 compute-0 nova_compute[186241]: 2025-11-25 06:32:27.731 211770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 25 06:32:28 compute-0 nova_compute[186241]: 2025-11-25 06:32:28.368 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:28 compute-0 nova_compute[186241]: 2025-11-25 06:32:28.857 186245 DEBUG nova.compute.manager [req-dcfa4fc1-ee59-44b4-9451-e05603bbadef req-931a8414-1d96-47b3-b6f8-e9673d3682ca a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:28 compute-0 nova_compute[186241]: 2025-11-25 06:32:28.857 186245 DEBUG oslo_concurrency.lockutils [req-dcfa4fc1-ee59-44b4-9451-e05603bbadef req-931a8414-1d96-47b3-b6f8-e9673d3682ca a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:28 compute-0 nova_compute[186241]: 2025-11-25 06:32:28.858 186245 DEBUG oslo_concurrency.lockutils [req-dcfa4fc1-ee59-44b4-9451-e05603bbadef req-931a8414-1d96-47b3-b6f8-e9673d3682ca a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:28 compute-0 nova_compute[186241]: 2025-11-25 06:32:28.858 186245 DEBUG oslo_concurrency.lockutils [req-dcfa4fc1-ee59-44b4-9451-e05603bbadef req-931a8414-1d96-47b3-b6f8-e9673d3682ca a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:28 compute-0 nova_compute[186241]: 2025-11-25 06:32:28.858 186245 DEBUG nova.compute.manager [req-dcfa4fc1-ee59-44b4-9451-e05603bbadef req-931a8414-1d96-47b3-b6f8-e9673d3682ca a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] No waiting events found dispatching network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:32:28 compute-0 nova_compute[186241]: 2025-11-25 06:32:28.858 186245 WARNING nova.compute.manager [req-dcfa4fc1-ee59-44b4-9451-e05603bbadef req-931a8414-1d96-47b3-b6f8-e9673d3682ca a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received unexpected event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 for instance with vm_state active and task_state None.
Nov 25 06:32:29 compute-0 podman[217791]: 2025-11-25 06:32:29.06290743 +0000 UTC m=+0.042882657 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=edpm, io.buildah.version=1.41.3)
Nov 25 06:32:29 compute-0 nova_compute[186241]: 2025-11-25 06:32:29.931 186245 DEBUG nova.compute.manager [req-1a6d8385-582b-4258-82d6-f07ae0a6292e req-450f1a07-3637-4be1-ab13-38b5f178e860 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-changed-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:29 compute-0 nova_compute[186241]: 2025-11-25 06:32:29.932 186245 DEBUG nova.compute.manager [req-1a6d8385-582b-4258-82d6-f07ae0a6292e req-450f1a07-3637-4be1-ab13-38b5f178e860 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Refreshing instance network info cache due to event network-changed-af54200a-3890-4538-8af0-4a157900fd41. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:32:29 compute-0 nova_compute[186241]: 2025-11-25 06:32:29.932 186245 DEBUG oslo_concurrency.lockutils [req-1a6d8385-582b-4258-82d6-f07ae0a6292e req-450f1a07-3637-4be1-ab13-38b5f178e860 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:32:30 compute-0 nova_compute[186241]: 2025-11-25 06:32:30.338 186245 INFO nova.compute.manager [None req-eef34027-e506-4f0d-b939-61b73cf16c5f 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Get console output
Nov 25 06:32:30 compute-0 nova_compute[186241]: 2025-11-25 06:32:30.341 211770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 25 06:32:30 compute-0 nova_compute[186241]: 2025-11-25 06:32:30.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:32:30 compute-0 nova_compute[186241]: 2025-11-25 06:32:30.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.026 186245 DEBUG nova.compute.manager [req-6c12dafb-9f5c-404e-ad64-cdcaad1b6a07 req-b015ba1b-cd8f-415f-aded-1f933ebe36e5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.026 186245 DEBUG oslo_concurrency.lockutils [req-6c12dafb-9f5c-404e-ad64-cdcaad1b6a07 req-b015ba1b-cd8f-415f-aded-1f933ebe36e5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.027 186245 DEBUG oslo_concurrency.lockutils [req-6c12dafb-9f5c-404e-ad64-cdcaad1b6a07 req-b015ba1b-cd8f-415f-aded-1f933ebe36e5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.027 186245 DEBUG oslo_concurrency.lockutils [req-6c12dafb-9f5c-404e-ad64-cdcaad1b6a07 req-b015ba1b-cd8f-415f-aded-1f933ebe36e5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.027 186245 DEBUG nova.compute.manager [req-6c12dafb-9f5c-404e-ad64-cdcaad1b6a07 req-b015ba1b-cd8f-415f-aded-1f933ebe36e5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] No waiting events found dispatching network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.027 186245 WARNING nova.compute.manager [req-6c12dafb-9f5c-404e-ad64-cdcaad1b6a07 req-b015ba1b-cd8f-415f-aded-1f933ebe36e5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received unexpected event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 for instance with vm_state active and task_state None.
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.027 186245 DEBUG nova.compute.manager [req-6c12dafb-9f5c-404e-ad64-cdcaad1b6a07 req-b015ba1b-cd8f-415f-aded-1f933ebe36e5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.027 186245 DEBUG oslo_concurrency.lockutils [req-6c12dafb-9f5c-404e-ad64-cdcaad1b6a07 req-b015ba1b-cd8f-415f-aded-1f933ebe36e5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.028 186245 DEBUG oslo_concurrency.lockutils [req-6c12dafb-9f5c-404e-ad64-cdcaad1b6a07 req-b015ba1b-cd8f-415f-aded-1f933ebe36e5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.028 186245 DEBUG oslo_concurrency.lockutils [req-6c12dafb-9f5c-404e-ad64-cdcaad1b6a07 req-b015ba1b-cd8f-415f-aded-1f933ebe36e5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.028 186245 DEBUG nova.compute.manager [req-6c12dafb-9f5c-404e-ad64-cdcaad1b6a07 req-b015ba1b-cd8f-415f-aded-1f933ebe36e5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] No waiting events found dispatching network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.028 186245 WARNING nova.compute.manager [req-6c12dafb-9f5c-404e-ad64-cdcaad1b6a07 req-b015ba1b-cd8f-415f-aded-1f933ebe36e5 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received unexpected event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 for instance with vm_state active and task_state None.
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.341 186245 DEBUG nova.network.neutron [req-8907fcab-84e8-4385-89d7-49214d958c3d req-5c3b1898-bb64-4827-afc5-9251dedad6b1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Updated VIF entry in instance network info cache for port af54200a-3890-4538-8af0-4a157900fd41. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.342 186245 DEBUG nova.network.neutron [req-8907fcab-84e8-4385-89d7-49214d958c3d req-5c3b1898-bb64-4827-afc5-9251dedad6b1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Updating instance_info_cache with network_info: [{"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.448 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.449 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.449 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.449 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:32:31 compute-0 podman[217809]: 2025-11-25 06:32:31.511997871 +0000 UTC m=+0.039042534 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.845 186245 DEBUG oslo_concurrency.lockutils [req-8907fcab-84e8-4385-89d7-49214d958c3d req-5c3b1898-bb64-4827-afc5-9251dedad6b1 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.846 186245 DEBUG oslo_concurrency.lockutils [req-1a6d8385-582b-4258-82d6-f07ae0a6292e req-450f1a07-3637-4be1-ab13-38b5f178e860 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:32:31 compute-0 nova_compute[186241]: 2025-11-25 06:32:31.846 186245 DEBUG nova.network.neutron [req-1a6d8385-582b-4258-82d6-f07ae0a6292e req-450f1a07-3637-4be1-ab13-38b5f178e860 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Refreshing network info cache for port af54200a-3890-4538-8af0-4a157900fd41 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.110 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.247 186245 DEBUG oslo_concurrency.lockutils [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "afacab74-90bc-4c94-9989-57a24bca630d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.247 186245 DEBUG oslo_concurrency.lockutils [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.247 186245 DEBUG oslo_concurrency.lockutils [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "afacab74-90bc-4c94-9989-57a24bca630d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.247 186245 DEBUG oslo_concurrency.lockutils [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.248 186245 DEBUG oslo_concurrency.lockutils [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.248 186245 INFO nova.compute.manager [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Terminating instance
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.477 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.523 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.524 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.566 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.570 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.624 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.625 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.668 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.751 186245 DEBUG nova.compute.manager [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:32:32 compute-0 kernel: tap4b069377-07 (unregistering): left promiscuous mode
Nov 25 06:32:32 compute-0 NetworkManager[55345]: <info>  [1764052352.7724] device (tap4b069377-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:32:32 compute-0 ovn_controller[95135]: 2025-11-25T06:32:32Z|00163|binding|INFO|Releasing lport 4b069377-07a8-4b40-a297-6a1002cb49d0 from this chassis (sb_readonly=0)
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.777 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:32 compute-0 ovn_controller[95135]: 2025-11-25T06:32:32Z|00164|binding|INFO|Setting lport 4b069377-07a8-4b40-a297-6a1002cb49d0 down in Southbound
Nov 25 06:32:32 compute-0 ovn_controller[95135]: 2025-11-25T06:32:32Z|00165|binding|INFO|Removing iface tap4b069377-07 ovn-installed in OVS
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.782 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:aa:1e 10.100.0.14'], port_security=['fa:16:3e:7d:aa:1e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'afacab74-90bc-4c94-9989-57a24bca630d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-949ef554-1519-45e1-97c2-6c679a7a80e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f4be2d4a-1533-46ed-87bc-ac265b9010e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=203ab58e-73f0-45ec-9572-3acf3a7b4768, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=4b069377-07a8-4b40-a297-6a1002cb49d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.783 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 4b069377-07a8-4b40-a297-6a1002cb49d0 in datapath 949ef554-1519-45e1-97c2-6c679a7a80e3 unbound from our chassis
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.784 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 949ef554-1519-45e1-97c2-6c679a7a80e3
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.793 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.797 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[fecf2695-8d11-474a-b6c5-d0305c9e8f1a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:32 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Nov 25 06:32:32 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 11.404s CPU time.
Nov 25 06:32:32 compute-0 systemd-machined[152921]: Machine qemu-12-instance-0000000c terminated.
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.821 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[2287f51e-1e8c-4be8-b108-7ee28bebdd87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.823 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd0057c-0922-49c7-9684-a642e4f6cb43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.843 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[71cf9fba-245a-4dfa-a073-9a227a26bf16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.861 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[81be892d-ce22-4f0d-bcd4-2c69c60eff2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap949ef554-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:9d:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327226, 'reachable_time': 30134, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217857, 'error': None, 'target': 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.874 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[9e8d1cdd-9313-4531-a163-6fa7872ddb16]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap949ef554-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327234, 'tstamp': 327234}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217858, 'error': None, 'target': 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap949ef554-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327236, 'tstamp': 327236}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217858, 'error': None, 'target': 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.875 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap949ef554-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.876 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.879 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.880 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap949ef554-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.880 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.880 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap949ef554-10, col_values=(('external_ids', {'iface-id': '4da4604e-e99e-4437-9f7c-6c43af2847ac'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.880 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.881 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ba9bef-b56c-437b-8653-484011f62769]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-949ef554-1519-45e1-97c2-6c679a7a80e3\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 949ef554-1519-45e1-97c2-6c679a7a80e3\n') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.943 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.944 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5478MB free_disk=72.95974731445312GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.944 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.944 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.980 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:32:32 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:32.981 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.981 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.990 186245 INFO nova.virt.libvirt.driver [-] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Instance destroyed successfully.
Nov 25 06:32:32 compute-0 nova_compute[186241]: 2025-11-25 06:32:32.990 186245 DEBUG nova.objects.instance [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid afacab74-90bc-4c94-9989-57a24bca630d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.080 186245 DEBUG nova.compute.manager [req-5072e171-cee6-4464-bbe1-54d244b9e4b2 req-0f68ed69-7523-440e-adf3-ecd2b7be81c8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Received event network-vif-unplugged-4b069377-07a8-4b40-a297-6a1002cb49d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.080 186245 DEBUG oslo_concurrency.lockutils [req-5072e171-cee6-4464-bbe1-54d244b9e4b2 req-0f68ed69-7523-440e-adf3-ecd2b7be81c8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "afacab74-90bc-4c94-9989-57a24bca630d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.080 186245 DEBUG oslo_concurrency.lockutils [req-5072e171-cee6-4464-bbe1-54d244b9e4b2 req-0f68ed69-7523-440e-adf3-ecd2b7be81c8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.081 186245 DEBUG oslo_concurrency.lockutils [req-5072e171-cee6-4464-bbe1-54d244b9e4b2 req-0f68ed69-7523-440e-adf3-ecd2b7be81c8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.081 186245 DEBUG nova.compute.manager [req-5072e171-cee6-4464-bbe1-54d244b9e4b2 req-0f68ed69-7523-440e-adf3-ecd2b7be81c8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] No waiting events found dispatching network-vif-unplugged-4b069377-07a8-4b40-a297-6a1002cb49d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.081 186245 DEBUG nova.compute.manager [req-5072e171-cee6-4464-bbe1-54d244b9e4b2 req-0f68ed69-7523-440e-adf3-ecd2b7be81c8 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Received event network-vif-unplugged-4b069377-07a8-4b40-a297-6a1002cb49d0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.214 186245 DEBUG nova.compute.manager [req-1d28659d-0cd9-4fee-8bbb-6d5589dd648c req-3e48bf4b-5303-4132-bfa6-43f27ffc34fd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Received event network-changed-4b069377-07a8-4b40-a297-6a1002cb49d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.214 186245 DEBUG nova.compute.manager [req-1d28659d-0cd9-4fee-8bbb-6d5589dd648c req-3e48bf4b-5303-4132-bfa6-43f27ffc34fd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Refreshing instance network info cache due to event network-changed-4b069377-07a8-4b40-a297-6a1002cb49d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.214 186245 DEBUG oslo_concurrency.lockutils [req-1d28659d-0cd9-4fee-8bbb-6d5589dd648c req-3e48bf4b-5303-4132-bfa6-43f27ffc34fd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-afacab74-90bc-4c94-9989-57a24bca630d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.214 186245 DEBUG oslo_concurrency.lockutils [req-1d28659d-0cd9-4fee-8bbb-6d5589dd648c req-3e48bf4b-5303-4132-bfa6-43f27ffc34fd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-afacab74-90bc-4c94-9989-57a24bca630d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.214 186245 DEBUG nova.network.neutron [req-1d28659d-0cd9-4fee-8bbb-6d5589dd648c req-3e48bf4b-5303-4132-bfa6-43f27ffc34fd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Refreshing network info cache for port 4b069377-07a8-4b40-a297-6a1002cb49d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.369 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.493 186245 DEBUG nova.virt.libvirt.vif [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:31:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-646815711',display_name='tempest-TestNetworkBasicOps-server-646815711',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-646815711',id=12,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIw+NJqacZin6vyCp8LqtKU0xN0GX9tRvX6CN4wtS5FXFTz8F9RxRd3usS1v0JTZDi+00cXkfPnzlD8TexQGqGFEfiIfSkTFaJPyRAGOO4rKmljfz0Is5i/68e0r318Nsg==',key_name='tempest-TestNetworkBasicOps-1173617577',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:32:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-on805j2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:32:03Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=afacab74-90bc-4c94-9989-57a24bca630d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b069377-07a8-4b40-a297-6a1002cb49d0", "address": "fa:16:3e:7d:aa:1e", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b069377-07", "ovs_interfaceid": "4b069377-07a8-4b40-a297-6a1002cb49d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.493 186245 DEBUG nova.network.os_vif_util [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "4b069377-07a8-4b40-a297-6a1002cb49d0", "address": "fa:16:3e:7d:aa:1e", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b069377-07", "ovs_interfaceid": "4b069377-07a8-4b40-a297-6a1002cb49d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.494 186245 DEBUG nova.network.os_vif_util [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7d:aa:1e,bridge_name='br-int',has_traffic_filtering=True,id=4b069377-07a8-4b40-a297-6a1002cb49d0,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b069377-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.494 186245 DEBUG os_vif [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:aa:1e,bridge_name='br-int',has_traffic_filtering=True,id=4b069377-07a8-4b40-a297-6a1002cb49d0,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b069377-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.495 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.495 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b069377-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.498 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.498 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.498 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=1920e5c6-7ade-424c-a211-5261889c4a45) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.499 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.501 186245 INFO os_vif [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:aa:1e,bridge_name='br-int',has_traffic_filtering=True,id=4b069377-07a8-4b40-a297-6a1002cb49d0,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b069377-07')
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.501 186245 INFO nova.virt.libvirt.driver [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Deleting instance files /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d_del
Nov 25 06:32:33 compute-0 nova_compute[186241]: 2025-11-25 06:32:33.502 186245 INFO nova.virt.libvirt.driver [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Deletion of /var/lib/nova/instances/afacab74-90bc-4c94-9989-57a24bca630d_del complete
Nov 25 06:32:34 compute-0 nova_compute[186241]: 2025-11-25 06:32:34.008 186245 INFO nova.compute.manager [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Took 1.26 seconds to destroy the instance on the hypervisor.
Nov 25 06:32:34 compute-0 nova_compute[186241]: 2025-11-25 06:32:34.009 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:32:34 compute-0 nova_compute[186241]: 2025-11-25 06:32:34.009 186245 DEBUG nova.compute.manager [-] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:32:34 compute-0 nova_compute[186241]: 2025-11-25 06:32:34.009 186245 DEBUG nova.network.neutron [-] [instance: afacab74-90bc-4c94-9989-57a24bca630d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:32:34 compute-0 nova_compute[186241]: 2025-11-25 06:32:34.136 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance 423a1897-c822-497b-a9e5-9127b1ec1b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:32:34 compute-0 nova_compute[186241]: 2025-11-25 06:32:34.136 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance afacab74-90bc-4c94-9989-57a24bca630d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:32:34 compute-0 nova_compute[186241]: 2025-11-25 06:32:34.136 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:32:34 compute-0 nova_compute[186241]: 2025-11-25 06:32:34.136 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:32:34 compute-0 nova_compute[186241]: 2025-11-25 06:32:34.184 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:32:34 compute-0 nova_compute[186241]: 2025-11-25 06:32:34.688 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:32:35 compute-0 nova_compute[186241]: 2025-11-25 06:32:35.193 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:32:35 compute-0 nova_compute[186241]: 2025-11-25 06:32:35.193 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:35 compute-0 nova_compute[186241]: 2025-11-25 06:32:35.339 186245 DEBUG nova.compute.manager [req-905286c1-a88a-4317-8e9b-0eb6582d791b req-bc213671-72ba-4c93-9ff3-b7db72cf8f73 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Received event network-vif-plugged-4b069377-07a8-4b40-a297-6a1002cb49d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:35 compute-0 nova_compute[186241]: 2025-11-25 06:32:35.339 186245 DEBUG oslo_concurrency.lockutils [req-905286c1-a88a-4317-8e9b-0eb6582d791b req-bc213671-72ba-4c93-9ff3-b7db72cf8f73 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "afacab74-90bc-4c94-9989-57a24bca630d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:35 compute-0 nova_compute[186241]: 2025-11-25 06:32:35.340 186245 DEBUG oslo_concurrency.lockutils [req-905286c1-a88a-4317-8e9b-0eb6582d791b req-bc213671-72ba-4c93-9ff3-b7db72cf8f73 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:35 compute-0 nova_compute[186241]: 2025-11-25 06:32:35.340 186245 DEBUG oslo_concurrency.lockutils [req-905286c1-a88a-4317-8e9b-0eb6582d791b req-bc213671-72ba-4c93-9ff3-b7db72cf8f73 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:35 compute-0 nova_compute[186241]: 2025-11-25 06:32:35.340 186245 DEBUG nova.compute.manager [req-905286c1-a88a-4317-8e9b-0eb6582d791b req-bc213671-72ba-4c93-9ff3-b7db72cf8f73 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] No waiting events found dispatching network-vif-plugged-4b069377-07a8-4b40-a297-6a1002cb49d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:32:35 compute-0 nova_compute[186241]: 2025-11-25 06:32:35.340 186245 WARNING nova.compute.manager [req-905286c1-a88a-4317-8e9b-0eb6582d791b req-bc213671-72ba-4c93-9ff3-b7db72cf8f73 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Received unexpected event network-vif-plugged-4b069377-07a8-4b40-a297-6a1002cb49d0 for instance with vm_state active and task_state deleting.
Nov 25 06:32:36 compute-0 nova_compute[186241]: 2025-11-25 06:32:36.194 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:32:36 compute-0 nova_compute[186241]: 2025-11-25 06:32:36.194 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:32:36 compute-0 nova_compute[186241]: 2025-11-25 06:32:36.194 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:32:36 compute-0 nova_compute[186241]: 2025-11-25 06:32:36.194 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:32:37 compute-0 nova_compute[186241]: 2025-11-25 06:32:37.112 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:37 compute-0 nova_compute[186241]: 2025-11-25 06:32:37.336 186245 DEBUG nova.network.neutron [req-1a6d8385-582b-4258-82d6-f07ae0a6292e req-450f1a07-3637-4be1-ab13-38b5f178e860 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Updated VIF entry in instance network info cache for port af54200a-3890-4538-8af0-4a157900fd41. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:32:37 compute-0 nova_compute[186241]: 2025-11-25 06:32:37.337 186245 DEBUG nova.network.neutron [req-1a6d8385-582b-4258-82d6-f07ae0a6292e req-450f1a07-3637-4be1-ab13-38b5f178e860 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Updating instance_info_cache with network_info: [{"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:32:37 compute-0 nova_compute[186241]: 2025-11-25 06:32:37.517 186245 DEBUG nova.compute.manager [req-a0ff638b-9aa8-4383-8bce-d39fe2ba8bba req-10f6aaee-933c-4dbb-9357-045e724c5b27 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Received event network-vif-deleted-4b069377-07a8-4b40-a297-6a1002cb49d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:37 compute-0 nova_compute[186241]: 2025-11-25 06:32:37.517 186245 INFO nova.compute.manager [req-a0ff638b-9aa8-4383-8bce-d39fe2ba8bba req-10f6aaee-933c-4dbb-9357-045e724c5b27 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Neutron deleted interface 4b069377-07a8-4b40-a297-6a1002cb49d0; detaching it from the instance and deleting it from the info cache
Nov 25 06:32:37 compute-0 nova_compute[186241]: 2025-11-25 06:32:37.517 186245 DEBUG nova.network.neutron [req-a0ff638b-9aa8-4383-8bce-d39fe2ba8bba req-10f6aaee-933c-4dbb-9357-045e724c5b27 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:32:37 compute-0 nova_compute[186241]: 2025-11-25 06:32:37.840 186245 DEBUG oslo_concurrency.lockutils [req-1a6d8385-582b-4258-82d6-f07ae0a6292e req-450f1a07-3637-4be1-ab13-38b5f178e860 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:32:37 compute-0 nova_compute[186241]: 2025-11-25 06:32:37.867 186245 DEBUG nova.network.neutron [-] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:32:38 compute-0 nova_compute[186241]: 2025-11-25 06:32:38.021 186245 DEBUG nova.compute.manager [req-a0ff638b-9aa8-4383-8bce-d39fe2ba8bba req-10f6aaee-933c-4dbb-9357-045e724c5b27 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Detach interface failed, port_id=4b069377-07a8-4b40-a297-6a1002cb49d0, reason: Instance afacab74-90bc-4c94-9989-57a24bca630d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Nov 25 06:32:38 compute-0 nova_compute[186241]: 2025-11-25 06:32:38.369 186245 INFO nova.compute.manager [-] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Took 4.36 seconds to deallocate network for instance.
Nov 25 06:32:38 compute-0 nova_compute[186241]: 2025-11-25 06:32:38.499 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:38 compute-0 nova_compute[186241]: 2025-11-25 06:32:38.874 186245 DEBUG oslo_concurrency.lockutils [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:38 compute-0 nova_compute[186241]: 2025-11-25 06:32:38.875 186245 DEBUG oslo_concurrency.lockutils [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:38 compute-0 nova_compute[186241]: 2025-11-25 06:32:38.921 186245 DEBUG nova.compute.provider_tree [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:32:39 compute-0 nova_compute[186241]: 2025-11-25 06:32:39.331 186245 DEBUG nova.network.neutron [req-1d28659d-0cd9-4fee-8bbb-6d5589dd648c req-3e48bf4b-5303-4132-bfa6-43f27ffc34fd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Updated VIF entry in instance network info cache for port 4b069377-07a8-4b40-a297-6a1002cb49d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:32:39 compute-0 nova_compute[186241]: 2025-11-25 06:32:39.331 186245 DEBUG nova.network.neutron [req-1d28659d-0cd9-4fee-8bbb-6d5589dd648c req-3e48bf4b-5303-4132-bfa6-43f27ffc34fd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: afacab74-90bc-4c94-9989-57a24bca630d] Updating instance_info_cache with network_info: [{"id": "4b069377-07a8-4b40-a297-6a1002cb49d0", "address": "fa:16:3e:7d:aa:1e", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b069377-07", "ovs_interfaceid": "4b069377-07a8-4b40-a297-6a1002cb49d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:32:39 compute-0 nova_compute[186241]: 2025-11-25 06:32:39.426 186245 DEBUG nova.scheduler.client.report [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:32:39 compute-0 nova_compute[186241]: 2025-11-25 06:32:39.834 186245 DEBUG oslo_concurrency.lockutils [req-1d28659d-0cd9-4fee-8bbb-6d5589dd648c req-3e48bf4b-5303-4132-bfa6-43f27ffc34fd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-afacab74-90bc-4c94-9989-57a24bca630d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:32:39 compute-0 nova_compute[186241]: 2025-11-25 06:32:39.932 186245 DEBUG oslo_concurrency.lockutils [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:39 compute-0 nova_compute[186241]: 2025-11-25 06:32:39.953 186245 INFO nova.scheduler.client.report [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance afacab74-90bc-4c94-9989-57a24bca630d
Nov 25 06:32:40 compute-0 nova_compute[186241]: 2025-11-25 06:32:40.962 186245 DEBUG oslo_concurrency.lockutils [None req-ea0276fd-c9ca-412f-86c7-84ac475e86b9 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "afacab74-90bc-4c94-9989-57a24bca630d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.715s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:40 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:40.981 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:32:41 compute-0 podman[217876]: 2025-11-25 06:32:41.078620509 +0000 UTC m=+0.054978908 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 06:32:42 compute-0 nova_compute[186241]: 2025-11-25 06:32:42.112 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:42 compute-0 nova_compute[186241]: 2025-11-25 06:32:42.568 186245 DEBUG nova.compute.manager [req-319a3832-e582-4e43-ad81-798734f06a20 req-def53a5f-7715-47fc-b686-935562942851 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-changed-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:42 compute-0 nova_compute[186241]: 2025-11-25 06:32:42.568 186245 DEBUG nova.compute.manager [req-319a3832-e582-4e43-ad81-798734f06a20 req-def53a5f-7715-47fc-b686-935562942851 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Refreshing instance network info cache due to event network-changed-af54200a-3890-4538-8af0-4a157900fd41. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:32:42 compute-0 nova_compute[186241]: 2025-11-25 06:32:42.569 186245 DEBUG oslo_concurrency.lockutils [req-319a3832-e582-4e43-ad81-798734f06a20 req-def53a5f-7715-47fc-b686-935562942851 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:32:42 compute-0 nova_compute[186241]: 2025-11-25 06:32:42.569 186245 DEBUG oslo_concurrency.lockutils [req-319a3832-e582-4e43-ad81-798734f06a20 req-def53a5f-7715-47fc-b686-935562942851 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:32:42 compute-0 nova_compute[186241]: 2025-11-25 06:32:42.569 186245 DEBUG nova.network.neutron [req-319a3832-e582-4e43-ad81-798734f06a20 req-def53a5f-7715-47fc-b686-935562942851 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Refreshing network info cache for port af54200a-3890-4538-8af0-4a157900fd41 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.108 186245 DEBUG oslo_concurrency.lockutils [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "423a1897-c822-497b-a9e5-9127b1ec1b38" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.109 186245 DEBUG oslo_concurrency.lockutils [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.109 186245 DEBUG oslo_concurrency.lockutils [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.109 186245 DEBUG oslo_concurrency.lockutils [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.109 186245 DEBUG oslo_concurrency.lockutils [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.110 186245 INFO nova.compute.manager [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Terminating instance
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.501 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.614 186245 DEBUG nova.compute.manager [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:32:43 compute-0 kernel: tapaf54200a-38 (unregistering): left promiscuous mode
Nov 25 06:32:43 compute-0 NetworkManager[55345]: <info>  [1764052363.6368] device (tapaf54200a-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:32:43 compute-0 ovn_controller[95135]: 2025-11-25T06:32:43Z|00166|binding|INFO|Releasing lport af54200a-3890-4538-8af0-4a157900fd41 from this chassis (sb_readonly=0)
Nov 25 06:32:43 compute-0 ovn_controller[95135]: 2025-11-25T06:32:43Z|00167|binding|INFO|Setting lport af54200a-3890-4538-8af0-4a157900fd41 down in Southbound
Nov 25 06:32:43 compute-0 ovn_controller[95135]: 2025-11-25T06:32:43Z|00168|binding|INFO|Removing iface tapaf54200a-38 ovn-installed in OVS
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.641 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.653 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:b2:36 10.100.0.4'], port_security=['fa:16:3e:15:b2:36 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '423a1897-c822-497b-a9e5-9127b1ec1b38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-949ef554-1519-45e1-97c2-6c679a7a80e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '9', 'neutron:security_group_ids': '459b41c6-f1b9-460f-94da-cfc72f07e425', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=203ab58e-73f0-45ec-9572-3acf3a7b4768, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=af54200a-3890-4538-8af0-4a157900fd41) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.655 103953 INFO neutron.agent.ovn.metadata.agent [-] Port af54200a-3890-4538-8af0-4a157900fd41 in datapath 949ef554-1519-45e1-97c2-6c679a7a80e3 unbound from our chassis
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.656 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 949ef554-1519-45e1-97c2-6c679a7a80e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.656 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[c60bd1a5-3131-4e15-b23f-e0754f5c2048]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.657 103953 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3 namespace which is not needed anymore
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.661 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:43 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Nov 25 06:32:43 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 12.420s CPU time.
Nov 25 06:32:43 compute-0 systemd-machined[152921]: Machine qemu-11-instance-0000000b terminated.
Nov 25 06:32:43 compute-0 neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3[217414]: [NOTICE]   (217418) : haproxy version is 2.8.14-c23fe91
Nov 25 06:32:43 compute-0 neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3[217414]: [NOTICE]   (217418) : path to executable is /usr/sbin/haproxy
Nov 25 06:32:43 compute-0 neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3[217414]: [WARNING]  (217418) : Exiting Master process...
Nov 25 06:32:43 compute-0 neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3[217414]: [ALERT]    (217418) : Current worker (217420) exited with code 143 (Terminated)
Nov 25 06:32:43 compute-0 neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3[217414]: [WARNING]  (217418) : All workers exited. Exiting... (0)
Nov 25 06:32:43 compute-0 podman[217922]: 2025-11-25 06:32:43.738430048 +0000 UTC m=+0.022173964 container kill 8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:32:43 compute-0 systemd[1]: libpod-8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09.scope: Deactivated successfully.
Nov 25 06:32:43 compute-0 podman[217935]: 2025-11-25 06:32:43.770287979 +0000 UTC m=+0.018434260 container died 8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true)
Nov 25 06:32:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09-userdata-shm.mount: Deactivated successfully.
Nov 25 06:32:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-9b2718eaaed4fd368860f42b56c4a0b13ba83498a81b1c01205bede26071e013-merged.mount: Deactivated successfully.
Nov 25 06:32:43 compute-0 podman[217935]: 2025-11-25 06:32:43.789251936 +0000 UTC m=+0.037398217 container cleanup 8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:32:43 compute-0 systemd[1]: libpod-conmon-8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09.scope: Deactivated successfully.
Nov 25 06:32:43 compute-0 podman[217936]: 2025-11-25 06:32:43.79679436 +0000 UTC m=+0.041041298 container remove 8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.800 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[25ab0218-1635-420c-b646-6189546774c6]: (4, ("Tue Nov 25 06:32:43 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3 (8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09)\n8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09\nTue Nov 25 06:32:43 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3 (8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09)\n8fe7e43474e845bd4229a6f0126835be5f35264c3c075bdf08606d29c45b7b09\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.801 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[b4b26e1d-b0b1-4432-8ba4-ad12c75209c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.801 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/949ef554-1519-45e1-97c2-6c679a7a80e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.802 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[c952eb0e-e24c-4309-bcaa-c5e63d045f1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.803 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap949ef554-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:32:43 compute-0 kernel: tap949ef554-10: left promiscuous mode
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.804 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.807 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.809 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7b03bbe8-20d7-407a-8fcf-e5db6f5afcbc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.822 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:43 compute-0 NetworkManager[55345]: <error> [1764052363.8264] platform-linux: error reading net:/sys/class/net/tapaf54200a-38/dev_id: error reading 4096 bytes from file descriptor: Invalid argument
Nov 25 06:32:43 compute-0 NetworkManager[55345]: <info>  [1764052363.8268] manager: (tapaf54200a-38): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.827 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.829 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[64b06ab3-458d-4949-97c4-e4d4ff7766c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.830 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[2aec9647-e7e1-4465-8d5d-f4476e1c89b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.833 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.844 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2e6cd1-65bc-43ab-96cb-bed1e6484f71]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327221, 'reachable_time': 35569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217969, 'error': None, 'target': 'ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.846 104066 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-949ef554-1519-45e1-97c2-6c679a7a80e3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 06:32:43 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:43.846 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[8b33b2fb-1d2c-4ed4-9c62-de70c9396265]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:32:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d949ef554\x2d1519\x2d45e1\x2d97c2\x2d6c679a7a80e3.mount: Deactivated successfully.
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.857 186245 INFO nova.virt.libvirt.driver [-] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Instance destroyed successfully.
Nov 25 06:32:43 compute-0 nova_compute[186241]: 2025-11-25 06:32:43.858 186245 DEBUG nova.objects.instance [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid 423a1897-c822-497b-a9e5-9127b1ec1b38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.361 186245 DEBUG nova.virt.libvirt.vif [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:31:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1463597864',display_name='tempest-TestNetworkBasicOps-server-1463597864',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1463597864',id=11,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB9YZGtnzOrlYGmISpA+unwY4gONLlAjFqumwC23Q7+Crw5i+LqLcVlVt/0sCZ6d5eQBauJ/jx+crkSOk9SVpdrxjSZtY8aMCmxwkmfsK+tTsRzoWni06YjvZ9ACgoxHlw==',key_name='tempest-TestNetworkBasicOps-1642248967',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:31:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-4krqvcv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:31:25Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=423a1897-c822-497b-a9e5-9127b1ec1b38,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.361 186245 DEBUG nova.network.os_vif_util [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.362 186245 DEBUG nova.network.os_vif_util [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=af54200a-3890-4538-8af0-4a157900fd41,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf54200a-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.362 186245 DEBUG os_vif [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=af54200a-3890-4538-8af0-4a157900fd41,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf54200a-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.363 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.363 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf54200a-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.364 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.366 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.367 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.367 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=97cf69fd-301b-4d81-8cd4-3cfa528ee806) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.367 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.368 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.370 186245 INFO os_vif [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=af54200a-3890-4538-8af0-4a157900fd41,network=Network(949ef554-1519-45e1-97c2-6c679a7a80e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf54200a-38')
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.370 186245 INFO nova.virt.libvirt.driver [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Deleting instance files /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38_del
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.370 186245 INFO nova.virt.libvirt.driver [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Deletion of /var/lib/nova/instances/423a1897-c822-497b-a9e5-9127b1ec1b38_del complete
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.533 186245 DEBUG nova.compute.manager [req-17441432-02fc-4eee-8fc2-f3e01dc9d353 req-8f5e7254-8596-467d-9bc8-833b815a81f9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-vif-unplugged-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.533 186245 DEBUG oslo_concurrency.lockutils [req-17441432-02fc-4eee-8fc2-f3e01dc9d353 req-8f5e7254-8596-467d-9bc8-833b815a81f9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.533 186245 DEBUG oslo_concurrency.lockutils [req-17441432-02fc-4eee-8fc2-f3e01dc9d353 req-8f5e7254-8596-467d-9bc8-833b815a81f9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.533 186245 DEBUG oslo_concurrency.lockutils [req-17441432-02fc-4eee-8fc2-f3e01dc9d353 req-8f5e7254-8596-467d-9bc8-833b815a81f9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.533 186245 DEBUG nova.compute.manager [req-17441432-02fc-4eee-8fc2-f3e01dc9d353 req-8f5e7254-8596-467d-9bc8-833b815a81f9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] No waiting events found dispatching network-vif-unplugged-af54200a-3890-4538-8af0-4a157900fd41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.533 186245 DEBUG nova.compute.manager [req-17441432-02fc-4eee-8fc2-f3e01dc9d353 req-8f5e7254-8596-467d-9bc8-833b815a81f9 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-vif-unplugged-af54200a-3890-4538-8af0-4a157900fd41 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.878 186245 INFO nova.compute.manager [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Took 1.26 seconds to destroy the instance on the hypervisor.
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.878 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.878 186245 DEBUG nova.compute.manager [-] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:32:44 compute-0 nova_compute[186241]: 2025-11-25 06:32:44.879 186245 DEBUG nova.network.neutron [-] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:32:45 compute-0 podman[217979]: 2025-11-25 06:32:45.066936614 +0000 UTC m=+0.042385368 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:32:45 compute-0 podman[217978]: 2025-11-25 06:32:45.06903192 +0000 UTC m=+0.046080148 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=multipathd)
Nov 25 06:32:45 compute-0 nova_compute[186241]: 2025-11-25 06:32:45.830 186245 DEBUG nova.compute.manager [req-ad0bf281-1206-4877-9568-d5df236e4383 req-4a574634-8c4f-49cd-9a0b-6b69401ea344 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-vif-deleted-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:45 compute-0 nova_compute[186241]: 2025-11-25 06:32:45.831 186245 INFO nova.compute.manager [req-ad0bf281-1206-4877-9568-d5df236e4383 req-4a574634-8c4f-49cd-9a0b-6b69401ea344 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Neutron deleted interface af54200a-3890-4538-8af0-4a157900fd41; detaching it from the instance and deleting it from the info cache
Nov 25 06:32:45 compute-0 nova_compute[186241]: 2025-11-25 06:32:45.832 186245 DEBUG nova.network.neutron [req-ad0bf281-1206-4877-9568-d5df236e4383 req-4a574634-8c4f-49cd-9a0b-6b69401ea344 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:32:46 compute-0 nova_compute[186241]: 2025-11-25 06:32:46.168 186245 DEBUG nova.network.neutron [-] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:32:46 compute-0 nova_compute[186241]: 2025-11-25 06:32:46.319 186245 DEBUG nova.network.neutron [req-319a3832-e582-4e43-ad81-798734f06a20 req-def53a5f-7715-47fc-b686-935562942851 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Updated VIF entry in instance network info cache for port af54200a-3890-4538-8af0-4a157900fd41. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:32:46 compute-0 nova_compute[186241]: 2025-11-25 06:32:46.320 186245 DEBUG nova.network.neutron [req-319a3832-e582-4e43-ad81-798734f06a20 req-def53a5f-7715-47fc-b686-935562942851 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Updating instance_info_cache with network_info: [{"id": "af54200a-3890-4538-8af0-4a157900fd41", "address": "fa:16:3e:15:b2:36", "network": {"id": "949ef554-1519-45e1-97c2-6c679a7a80e3", "bridge": "br-int", "label": "tempest-network-smoke--2111457677", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf54200a-38", "ovs_interfaceid": "af54200a-3890-4538-8af0-4a157900fd41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:32:46 compute-0 nova_compute[186241]: 2025-11-25 06:32:46.336 186245 DEBUG nova.compute.manager [req-ad0bf281-1206-4877-9568-d5df236e4383 req-4a574634-8c4f-49cd-9a0b-6b69401ea344 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Detach interface failed, port_id=af54200a-3890-4538-8af0-4a157900fd41, reason: Instance 423a1897-c822-497b-a9e5-9127b1ec1b38 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Nov 25 06:32:46 compute-0 nova_compute[186241]: 2025-11-25 06:32:46.673 186245 INFO nova.compute.manager [-] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Took 1.79 seconds to deallocate network for instance.
Nov 25 06:32:46 compute-0 nova_compute[186241]: 2025-11-25 06:32:46.723 186245 DEBUG nova.compute.manager [req-4cc49965-649d-4c21-9da8-8f75184a06e8 req-037633ad-0a14-47fd-8643-80c76c9b99cd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:32:46 compute-0 nova_compute[186241]: 2025-11-25 06:32:46.723 186245 DEBUG oslo_concurrency.lockutils [req-4cc49965-649d-4c21-9da8-8f75184a06e8 req-037633ad-0a14-47fd-8643-80c76c9b99cd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:46 compute-0 nova_compute[186241]: 2025-11-25 06:32:46.724 186245 DEBUG oslo_concurrency.lockutils [req-4cc49965-649d-4c21-9da8-8f75184a06e8 req-037633ad-0a14-47fd-8643-80c76c9b99cd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:46 compute-0 nova_compute[186241]: 2025-11-25 06:32:46.724 186245 DEBUG oslo_concurrency.lockutils [req-4cc49965-649d-4c21-9da8-8f75184a06e8 req-037633ad-0a14-47fd-8643-80c76c9b99cd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:46 compute-0 nova_compute[186241]: 2025-11-25 06:32:46.724 186245 DEBUG nova.compute.manager [req-4cc49965-649d-4c21-9da8-8f75184a06e8 req-037633ad-0a14-47fd-8643-80c76c9b99cd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] No waiting events found dispatching network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:32:46 compute-0 nova_compute[186241]: 2025-11-25 06:32:46.725 186245 WARNING nova.compute.manager [req-4cc49965-649d-4c21-9da8-8f75184a06e8 req-037633ad-0a14-47fd-8643-80c76c9b99cd a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 423a1897-c822-497b-a9e5-9127b1ec1b38] Received unexpected event network-vif-plugged-af54200a-3890-4538-8af0-4a157900fd41 for instance with vm_state deleted and task_state None.
Nov 25 06:32:46 compute-0 nova_compute[186241]: 2025-11-25 06:32:46.824 186245 DEBUG oslo_concurrency.lockutils [req-319a3832-e582-4e43-ad81-798734f06a20 req-def53a5f-7715-47fc-b686-935562942851 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-423a1897-c822-497b-a9e5-9127b1ec1b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:32:47 compute-0 nova_compute[186241]: 2025-11-25 06:32:47.115 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:47 compute-0 nova_compute[186241]: 2025-11-25 06:32:47.179 186245 DEBUG oslo_concurrency.lockutils [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:47 compute-0 nova_compute[186241]: 2025-11-25 06:32:47.179 186245 DEBUG oslo_concurrency.lockutils [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:47 compute-0 nova_compute[186241]: 2025-11-25 06:32:47.226 186245 DEBUG nova.compute.provider_tree [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:32:47 compute-0 nova_compute[186241]: 2025-11-25 06:32:47.730 186245 DEBUG nova.scheduler.client.report [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:32:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:47.764 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:32:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:47.764 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:32:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:32:47.764 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:48 compute-0 nova_compute[186241]: 2025-11-25 06:32:48.235 186245 DEBUG oslo_concurrency.lockutils [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:48 compute-0 nova_compute[186241]: 2025-11-25 06:32:48.255 186245 INFO nova.scheduler.client.report [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance 423a1897-c822-497b-a9e5-9127b1ec1b38
Nov 25 06:32:49 compute-0 podman[218018]: 2025-11-25 06:32:49.054882581 +0000 UTC m=+0.033597578 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:32:49 compute-0 nova_compute[186241]: 2025-11-25 06:32:49.265 186245 DEBUG oslo_concurrency.lockutils [None req-29618f02-615f-47ac-91ea-72540d41faff 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "423a1897-c822-497b-a9e5-9127b1ec1b38" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:32:49 compute-0 nova_compute[186241]: 2025-11-25 06:32:49.368 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:52 compute-0 nova_compute[186241]: 2025-11-25 06:32:52.116 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:53 compute-0 podman[218034]: 2025-11-25 06:32:53.056003333 +0000 UTC m=+0.036843403 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, maintainer=Red Hat, Inc.)
Nov 25 06:32:54 compute-0 nova_compute[186241]: 2025-11-25 06:32:54.369 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:55 compute-0 nova_compute[186241]: 2025-11-25 06:32:55.495 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:55 compute-0 nova_compute[186241]: 2025-11-25 06:32:55.571 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:57 compute-0 nova_compute[186241]: 2025-11-25 06:32:57.117 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:32:59 compute-0 nova_compute[186241]: 2025-11-25 06:32:59.370 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:00 compute-0 podman[218053]: 2025-11-25 06:33:00.062104309 +0000 UTC m=+0.041446272 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, 
tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 25 06:33:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:00.643 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:c9:d7 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ca12ab-7c69-415e-be0d-20e16fbabaec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71299229-251b-49dd-8c1a-540be467fcd4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=52ffae52-0232-48cd-ba01-ca45b9f24734) old=Port_Binding(mac=['fa:16:3e:96:c9:d7'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ca12ab-7c69-415e-be0d-20e16fbabaec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:33:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:00.643 103953 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 52ffae52-0232-48cd-ba01-ca45b9f24734 in datapath d7ca12ab-7c69-415e-be0d-20e16fbabaec updated
Nov 25 06:33:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:00.644 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7ca12ab-7c69-415e-be0d-20e16fbabaec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:33:00 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:00.644 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[6ecc6ee1-3aa4-4c89-806f-b6246e4cb5b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:02 compute-0 podman[218070]: 2025-11-25 06:33:02.059963784 +0000 UTC m=+0.038792074 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 25 06:33:02 compute-0 nova_compute[186241]: 2025-11-25 06:33:02.118 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:04 compute-0 nova_compute[186241]: 2025-11-25 06:33:04.372 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:07 compute-0 nova_compute[186241]: 2025-11-25 06:33:07.119 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:09 compute-0 nova_compute[186241]: 2025-11-25 06:33:09.373 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:11 compute-0 nova_compute[186241]: 2025-11-25 06:33:11.562 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "3e3567b4-364e-4663-82fb-6a7b8c296187" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:11 compute-0 nova_compute[186241]: 2025-11-25 06:33:11.563 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:12 compute-0 nova_compute[186241]: 2025-11-25 06:33:12.065 186245 DEBUG nova.compute.manager [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2439
Nov 25 06:33:12 compute-0 podman[218091]: 2025-11-25 06:33:12.077190415 +0000 UTC m=+0.053458484 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 06:33:12 compute-0 nova_compute[186241]: 2025-11-25 06:33:12.120 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:12 compute-0 nova_compute[186241]: 2025-11-25 06:33:12.596 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:12 compute-0 nova_compute[186241]: 2025-11-25 06:33:12.597 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:12 compute-0 nova_compute[186241]: 2025-11-25 06:33:12.602 186245 DEBUG nova.virt.hardware [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2468
Nov 25 06:33:12 compute-0 nova_compute[186241]: 2025-11-25 06:33:12.603 186245 INFO nova.compute.claims [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Claim successful on node compute-0.ctlplane.example.com
Nov 25 06:33:13 compute-0 nova_compute[186241]: 2025-11-25 06:33:13.644 186245 DEBUG nova.compute.provider_tree [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:33:14 compute-0 nova_compute[186241]: 2025-11-25 06:33:14.148 186245 DEBUG nova.scheduler.client.report [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:33:14 compute-0 nova_compute[186241]: 2025-11-25 06:33:14.375 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:14 compute-0 nova_compute[186241]: 2025-11-25 06:33:14.652 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:14 compute-0 nova_compute[186241]: 2025-11-25 06:33:14.653 186245 DEBUG nova.compute.manager [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2836
Nov 25 06:33:15 compute-0 nova_compute[186241]: 2025-11-25 06:33:15.158 186245 DEBUG nova.compute.manager [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1988
Nov 25 06:33:15 compute-0 nova_compute[186241]: 2025-11-25 06:33:15.158 186245 DEBUG nova.network.neutron [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1205
Nov 25 06:33:15 compute-0 nova_compute[186241]: 2025-11-25 06:33:15.496 186245 DEBUG nova.policy [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66a05d0ca82146a5a458244c8e5364de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:192
Nov 25 06:33:15 compute-0 nova_compute[186241]: 2025-11-25 06:33:15.663 186245 INFO nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 06:33:16 compute-0 podman[218115]: 2025-11-25 06:33:16.060193364 +0000 UTC m=+0.037027889 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:33:16 compute-0 podman[218114]: 2025-11-25 06:33:16.064902543 +0000 UTC m=+0.043771961 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 25 06:33:16 compute-0 nova_compute[186241]: 2025-11-25 06:33:16.167 186245 DEBUG nova.compute.manager [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2871
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.123 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.178 186245 DEBUG nova.compute.manager [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2645
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.178 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5185
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.179 186245 INFO nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Creating image(s)
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.179 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "/var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.179 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.180 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "/var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.181 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.183 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.184 186245 DEBUG oslo_concurrency.processutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.229 186245 DEBUG oslo_concurrency.processutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.229 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.230 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.230 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.233 186245 DEBUG oslo_utils.imageutils.format_inspector [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.9/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.234 186245 DEBUG oslo_concurrency.processutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.277 186245 DEBUG oslo_concurrency.processutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.278 186245 DEBUG oslo_concurrency.processutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.296 186245 DEBUG oslo_concurrency.processutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be,backing_fmt=raw /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk 1073741824" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.297 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.297 186245 DEBUG oslo_concurrency.processutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.339 186245 DEBUG oslo_concurrency.processutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/25f6d0e3a3cf04553f8f4cefeb1ff388d48a60be --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.340 186245 DEBUG nova.virt.disk.api [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Checking if we can resize image /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:164
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.340 186245 DEBUG oslo_concurrency.processutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.384 186245 DEBUG oslo_concurrency.processutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.385 186245 DEBUG nova.virt.disk.api [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Cannot resize image /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:170
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.385 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5317
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.385 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Ensure instance console log exists: /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5071
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.386 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.386 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.386 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:17 compute-0 nova_compute[186241]: 2025-11-25 06:33:17.489 186245 DEBUG nova.network.neutron [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Successfully created port: 509a0271-f192-4363-a7a6-2f2ba54791b3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 06:33:18 compute-0 nova_compute[186241]: 2025-11-25 06:33:18.330 186245 DEBUG nova.network.neutron [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Successfully updated port: 509a0271-f192-4363-a7a6-2f2ba54791b3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 06:33:18 compute-0 nova_compute[186241]: 2025-11-25 06:33:18.505 186245 DEBUG nova.compute.manager [req-40b79881-fd97-4fdd-8835-17a4edeb1e0a req-8090910a-74c7-4ca4-8c99-c67204499164 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Received event network-changed-509a0271-f192-4363-a7a6-2f2ba54791b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:33:18 compute-0 nova_compute[186241]: 2025-11-25 06:33:18.505 186245 DEBUG nova.compute.manager [req-40b79881-fd97-4fdd-8835-17a4edeb1e0a req-8090910a-74c7-4ca4-8c99-c67204499164 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Refreshing instance network info cache due to event network-changed-509a0271-f192-4363-a7a6-2f2ba54791b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:33:18 compute-0 nova_compute[186241]: 2025-11-25 06:33:18.505 186245 DEBUG oslo_concurrency.lockutils [req-40b79881-fd97-4fdd-8835-17a4edeb1e0a req-8090910a-74c7-4ca4-8c99-c67204499164 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-3e3567b4-364e-4663-82fb-6a7b8c296187" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:33:18 compute-0 nova_compute[186241]: 2025-11-25 06:33:18.506 186245 DEBUG oslo_concurrency.lockutils [req-40b79881-fd97-4fdd-8835-17a4edeb1e0a req-8090910a-74c7-4ca4-8c99-c67204499164 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-3e3567b4-364e-4663-82fb-6a7b8c296187" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:33:18 compute-0 nova_compute[186241]: 2025-11-25 06:33:18.506 186245 DEBUG nova.network.neutron [req-40b79881-fd97-4fdd-8835-17a4edeb1e0a req-8090910a-74c7-4ca4-8c99-c67204499164 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Refreshing network info cache for port 509a0271-f192-4363-a7a6-2f2ba54791b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:33:18 compute-0 nova_compute[186241]: 2025-11-25 06:33:18.834 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "refresh_cache-3e3567b4-364e-4663-82fb-6a7b8c296187" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:33:19 compute-0 nova_compute[186241]: 2025-11-25 06:33:19.335 186245 DEBUG nova.network.neutron [req-40b79881-fd97-4fdd-8835-17a4edeb1e0a req-8090910a-74c7-4ca4-8c99-c67204499164 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:33:19 compute-0 nova_compute[186241]: 2025-11-25 06:33:19.377 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:19 compute-0 nova_compute[186241]: 2025-11-25 06:33:19.948 186245 DEBUG nova.network.neutron [req-40b79881-fd97-4fdd-8835-17a4edeb1e0a req-8090910a-74c7-4ca4-8c99-c67204499164 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:33:20 compute-0 podman[218167]: 2025-11-25 06:33:20.052866285 +0000 UTC m=+0.032526040 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:33:20 compute-0 nova_compute[186241]: 2025-11-25 06:33:20.452 186245 DEBUG oslo_concurrency.lockutils [req-40b79881-fd97-4fdd-8835-17a4edeb1e0a req-8090910a-74c7-4ca4-8c99-c67204499164 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-3e3567b4-364e-4663-82fb-6a7b8c296187" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:33:20 compute-0 nova_compute[186241]: 2025-11-25 06:33:20.453 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquired lock "refresh_cache-3e3567b4-364e-4663-82fb-6a7b8c296187" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:33:20 compute-0 nova_compute[186241]: 2025-11-25 06:33:20.453 186245 DEBUG nova.network.neutron [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2067
Nov 25 06:33:21 compute-0 nova_compute[186241]: 2025-11-25 06:33:21.343 186245 DEBUG nova.network.neutron [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3380
Nov 25 06:33:22 compute-0 nova_compute[186241]: 2025-11-25 06:33:22.123 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:24 compute-0 podman[218183]: 2025-11-25 06:33:24.058935175 +0000 UTC m=+0.039212996 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 06:33:24 compute-0 nova_compute[186241]: 2025-11-25 06:33:24.379 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.046 186245 DEBUG nova.network.neutron [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Updating instance_info_cache with network_info: [{"id": "509a0271-f192-4363-a7a6-2f2ba54791b3", "address": "fa:16:3e:7c:64:2e", "network": {"id": "d7ca12ab-7c69-415e-be0d-20e16fbabaec", "bridge": "br-int", "label": "tempest-network-smoke--852150635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509a0271-f1", "ovs_interfaceid": "509a0271-f192-4363-a7a6-2f2ba54791b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.549 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Releasing lock "refresh_cache-3e3567b4-364e-4663-82fb-6a7b8c296187" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.550 186245 DEBUG nova.compute.manager [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Instance network_info: |[{"id": "509a0271-f192-4363-a7a6-2f2ba54791b3", "address": "fa:16:3e:7c:64:2e", "network": {"id": "d7ca12ab-7c69-415e-be0d-20e16fbabaec", "bridge": "br-int", "label": "tempest-network-smoke--852150635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509a0271-f1", "ovs_interfaceid": "509a0271-f192-4363-a7a6-2f2ba54791b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:2003
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.551 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Start _get_guest_xml network_info=[{"id": "509a0271-f192-4363-a7a6-2f2ba54791b3", "address": "fa:16:3e:7c:64:2e", "network": {"id": "d7ca12ab-7c69-415e-be0d-20e16fbabaec", "bridge": "br-int", "label": "tempest-network-smoke--852150635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509a0271-f1", "ovs_interfaceid": "509a0271-f192-4363-a7a6-2f2ba54791b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '5215c26e-be2f-40b4-ac47-476bfa3cf3f2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8041
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.554 186245 WARNING nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.555 186245 DEBUG nova.virt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', instance_meta=NovaInstanceMeta(name='tempest-TestNetworkBasicOps-server-1555126729', uuid='3e3567b4-364e-4663-82fb-6a7b8c296187'), owner=OwnerMeta(userid='66a05d0ca82146a5a458244c8e5364de', username='tempest-TestNetworkBasicOps-1672753768-project-member', projectid='d90b557db9104ecfb816b1cdab8712bd', projectname='tempest-TestNetworkBasicOps-1672753768'), image=ImageMeta(id='5215c26e-be2f-40b4-ac47-476bfa3cf3f2', name=None, properties=ImageMetaProps(hw_architecture=<?>,hw_auto_disk_config=<?>,hw_boot_menu=<?>,hw_cdrom_bus=<?>,hw_cpu_cores=<?>,hw_cpu_max_cores=<?>,hw_cpu_max_sockets=<?>,hw_cpu_max_threads=<?>,hw_cpu_policy=<?>,hw_cpu_realtime_mask=<?>,hw_cpu_sockets=<?>,hw_cpu_thread_policy=<?>,hw_cpu_threads=<?>,hw_device_id=<?>,hw_disk_bus=<?>,hw_disk_type=<?>,hw_emulation_architecture=<?>,hw_ephemeral_encryption=<?>,hw_ephemeral_encryption_format=<?>,hw_ephemeral_encryption_secret_uuid=<?>,hw_firmware_stateless=<?>,hw_firmware_type=<?>,hw_floppy_bus=<?>,hw_input_bus=<?>,hw_ipxe_boot=<?>,hw_locked_memory=<?>,hw_machine_type=<?>,hw_maxphysaddr_bits=<?>,hw_maxphysaddr_mode=<?>,hw_mem_encryption=<?>,hw_mem_page_size=<?>,hw_numa_cpus=<?>,hw_numa_mem=<?>,hw_numa_nodes=<?>,hw_pci_numa_affinity_policy=<?>,hw_pmu=<?>,hw_pointer_model=<?>,hw_qemu_guest_agent=<?>,hw_rescue_bus=<?>,hw_rescue_device=<?>,hw_rng_model='virtio',hw_scsi_model=<?>,hw_serial_port_count=<?>,hw_time_hpet=<?>,hw_tpm_model=<?>,hw_tpm_version=<?>,hw_video_model=<?>,hw_video_ram=<?>,hw_vif_model=<?>,hw_vif_multiqueue_enabled=<?>,hw_viommu_model=<?>,hw_virtio_packed_ring=<?>,hw_vm_mode=<?>,hw_watchdog_action=<?>,img_bdm_v2=<?>,img_bittorrent=<?>,img_block_device_mapping=<?>,img_cache_in_nova=<?>,img_compression_level=<?>,img_config_drive=<?>,img_hide_hypervisor_id=<?>,img_hv_requested_version=<?>,img_hv_type=<?>,img_linked_clone=<?>,img_mappings=<?>,img_owner_id=<?>,img_root_device_name=<?>,img_signature=<?>,img_signature_certificate_uuid=<?>,img_signature_hash_method=<?>,img_signature_key_type=<?>,img_use_agent=<?>,img_version=<?>,os_admin_user=<?>,os_command_line=<?>,os_distro=<?>,os_require_quiesce=<?>,os_secure_boot=<?>,os_skip_agent_inject_files_at_boot=<?>,os_skip_agent_inject_ssh=<?>,os_type=<?>,traits_required=<?>)), flavor=FlavorMeta(name='m1.nano', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "509a0271-f192-4363-a7a6-2f2ba54791b3", "address": "fa:16:3e:7c:64:2e", "network": {"id": "d7ca12ab-7c69-415e-be0d-20e16fbabaec", "bridge": "br-int", "label": "tempest-network-smoke--852150635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509a0271-f1", "ovs_interfaceid": "509a0271-f192-4363-a7a6-2f2ba54791b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='31.1.0-0.20250428102727.3e7017e.el9', creation_time=1764052405.555203) get_instance_driver_metadata /usr/lib/python3.9/site-packages/nova/virt/driver.py:401
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.563 186245 DEBUG nova.virt.libvirt.host [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1695
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.564 186245 DEBUG nova.virt.libvirt.host [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1705
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.567 186245 DEBUG nova.virt.libvirt.host [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1714
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.568 186245 DEBUG nova.virt.libvirt.host [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1721
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.568 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5856
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.568 186245 DEBUG nova.virt.hardware [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T06:18:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='53fe9ba3-b68b-4b49-b6ff-9cbebfa9f9ac',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T06:18:09Z,direct_url=<?>,disk_format='qcow2',id=5215c26e-be2f-40b4-ac47-476bfa3cf3f2,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='569b0ed2b3cc4372897b86d284219992',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T06:18:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:567
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.568 186245 DEBUG nova.virt.hardware [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.569 186245 DEBUG nova.virt.hardware [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:356
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.569 186245 DEBUG nova.virt.hardware [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.569 186245 DEBUG nova.virt.hardware [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:396
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.569 186245 DEBUG nova.virt.hardware [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:434
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.569 186245 DEBUG nova.virt.hardware [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:573
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.569 186245 DEBUG nova.virt.hardware [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:475
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.569 186245 DEBUG nova.virt.hardware [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:505
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.570 186245 DEBUG nova.virt.hardware [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:579
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.570 186245 DEBUG nova.virt.hardware [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:581
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.572 186245 DEBUG nova.virt.libvirt.vif [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:33:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1555126729',display_name='tempest-TestNetworkBasicOps-server-1555126729',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1555126729',id=13,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMamTGNu+pgqfT14H5GiPNJTaBKE2C6EeXRosqXkYEtnp6ufmDyaZ6AGyCPn1Jitqb+VSd4DDMgvHLatkEVlQ0PiyP0uXnqra1nP9RkOaF8WOIzDqsi58jateJwjcAxfUA==',key_name='tempest-TestNetworkBasicOps-183630991',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-uprckrac',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:33:16Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=3e3567b4-364e-4663-82fb-6a7b8c296187,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "509a0271-f192-4363-a7a6-2f2ba54791b3", "address": "fa:16:3e:7c:64:2e", "network": {"id": "d7ca12ab-7c69-415e-be0d-20e16fbabaec", "bridge": "br-int", "label": "tempest-network-smoke--852150635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509a0271-f1", "ovs_interfaceid": "509a0271-f192-4363-a7a6-2f2ba54791b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.572 186245 DEBUG nova.network.os_vif_util [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "509a0271-f192-4363-a7a6-2f2ba54791b3", "address": "fa:16:3e:7c:64:2e", "network": {"id": "d7ca12ab-7c69-415e-be0d-20e16fbabaec", "bridge": "br-int", "label": "tempest-network-smoke--852150635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509a0271-f1", "ovs_interfaceid": "509a0271-f192-4363-a7a6-2f2ba54791b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.573 186245 DEBUG nova.network.os_vif_util [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:64:2e,bridge_name='br-int',has_traffic_filtering=True,id=509a0271-f192-4363-a7a6-2f2ba54791b3,network=Network(d7ca12ab-7c69-415e-be0d-20e16fbabaec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509a0271-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:33:25 compute-0 nova_compute[186241]: 2025-11-25 06:33:25.574 186245 DEBUG nova.objects.instance [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'pci_devices' on Instance uuid 3e3567b4-364e-4663-82fb-6a7b8c296187 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.079 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] End _get_guest_xml xml=<domain type="kvm">
Nov 25 06:33:26 compute-0 nova_compute[186241]:   <uuid>3e3567b4-364e-4663-82fb-6a7b8c296187</uuid>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   <name>instance-0000000d</name>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   <memory>131072</memory>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   <vcpu>1</vcpu>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   <metadata>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <nova:package version="31.1.0-0.20250428102727.3e7017e.el9"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <nova:name>tempest-TestNetworkBasicOps-server-1555126729</nova:name>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <nova:creationTime>2025-11-25 06:33:25</nova:creationTime>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <nova:flavor name="m1.nano">
Nov 25 06:33:26 compute-0 nova_compute[186241]:         <nova:memory>128</nova:memory>
Nov 25 06:33:26 compute-0 nova_compute[186241]:         <nova:disk>1</nova:disk>
Nov 25 06:33:26 compute-0 nova_compute[186241]:         <nova:swap>0</nova:swap>
Nov 25 06:33:26 compute-0 nova_compute[186241]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 06:33:26 compute-0 nova_compute[186241]:         <nova:vcpus>1</nova:vcpus>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       </nova:flavor>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <nova:owner>
Nov 25 06:33:26 compute-0 nova_compute[186241]:         <nova:user uuid="66a05d0ca82146a5a458244c8e5364de">tempest-TestNetworkBasicOps-1672753768-project-member</nova:user>
Nov 25 06:33:26 compute-0 nova_compute[186241]:         <nova:project uuid="d90b557db9104ecfb816b1cdab8712bd">tempest-TestNetworkBasicOps-1672753768</nova:project>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       </nova:owner>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <nova:root type="image" uuid="5215c26e-be2f-40b4-ac47-476bfa3cf3f2"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <nova:ports>
Nov 25 06:33:26 compute-0 nova_compute[186241]:         <nova:port uuid="509a0271-f192-4363-a7a6-2f2ba54791b3">
Nov 25 06:33:26 compute-0 nova_compute[186241]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:         </nova:port>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       </nova:ports>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     </nova:instance>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   </metadata>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   <sysinfo type="smbios">
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <system>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <entry name="manufacturer">RDO</entry>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <entry name="product">OpenStack Compute</entry>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <entry name="version">31.1.0-0.20250428102727.3e7017e.el9</entry>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <entry name="serial">3e3567b4-364e-4663-82fb-6a7b8c296187</entry>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <entry name="uuid">3e3567b4-364e-4663-82fb-6a7b8c296187</entry>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <entry name="family">Virtual Machine</entry>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     </system>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   </sysinfo>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   <os>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <boot dev="hd"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <smbios mode="sysinfo"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   </os>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   <features>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <acpi/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <apic/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <vmcoreinfo/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   </features>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   <clock offset="utc">
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <timer name="hpet" present="no"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   </clock>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   <cpu mode="host-model" match="exact">
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   </cpu>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   <devices>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <disk type="file" device="disk">
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <target dev="vda" bus="virtio"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <disk type="file" device="cdrom">
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <source file="/var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk.config"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <target dev="sda" bus="sata"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     </disk>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <interface type="ethernet">
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <mac address="fa:16:3e:7c:64:2e"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <mtu size="1442"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <target dev="tap509a0271-f1"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     </interface>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <serial type="pty">
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <log file="/var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/console.log" append="off"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     </serial>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <video>
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <model type="virtio"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     </video>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <input type="tablet" bus="usb"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <rng model="virtio">
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <backend model="random">/dev/urandom</backend>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     </rng>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <controller type="usb" index="0"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     <memballoon model="virtio">
Nov 25 06:33:26 compute-0 nova_compute[186241]:       <stats period="10"/>
Nov 25 06:33:26 compute-0 nova_compute[186241]:     </memballoon>
Nov 25 06:33:26 compute-0 nova_compute[186241]:   </devices>
Nov 25 06:33:26 compute-0 nova_compute[186241]: </domain>
Nov 25 06:33:26 compute-0 nova_compute[186241]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:8047
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.080 186245 DEBUG nova.compute.manager [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Preparing to wait for external event network-vif-plugged-509a0271-f192-4363-a7a6-2f2ba54791b3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:284
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.080 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.080 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.080 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.080 186245 DEBUG nova.virt.libvirt.vif [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T06:33:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1555126729',display_name='tempest-TestNetworkBasicOps-server-1555126729',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1555126729',id=13,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMamTGNu+pgqfT14H5GiPNJTaBKE2C6EeXRosqXkYEtnp6ufmDyaZ6AGyCPn1Jitqb+VSd4DDMgvHLatkEVlQ0PiyP0uXnqra1nP9RkOaF8WOIzDqsi58jateJwjcAxfUA==',key_name='tempest-TestNetworkBasicOps-183630991',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-uprckrac',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T06:33:16Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=3e3567b4-364e-4663-82fb-6a7b8c296187,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "509a0271-f192-4363-a7a6-2f2ba54791b3", "address": "fa:16:3e:7c:64:2e", "network": {"id": "d7ca12ab-7c69-415e-be0d-20e16fbabaec", "bridge": "br-int", "label": "tempest-network-smoke--852150635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509a0271-f1", "ovs_interfaceid": "509a0271-f192-4363-a7a6-2f2ba54791b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.081 186245 DEBUG nova.network.os_vif_util [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "509a0271-f192-4363-a7a6-2f2ba54791b3", "address": "fa:16:3e:7c:64:2e", "network": {"id": "d7ca12ab-7c69-415e-be0d-20e16fbabaec", "bridge": "br-int", "label": "tempest-network-smoke--852150635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509a0271-f1", "ovs_interfaceid": "509a0271-f192-4363-a7a6-2f2ba54791b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.081 186245 DEBUG nova.network.os_vif_util [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:64:2e,bridge_name='br-int',has_traffic_filtering=True,id=509a0271-f192-4363-a7a6-2f2ba54791b3,network=Network(d7ca12ab-7c69-415e-be0d-20e16fbabaec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509a0271-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.081 186245 DEBUG os_vif [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:64:2e,bridge_name='br-int',has_traffic_filtering=True,id=509a0271-f192-4363-a7a6-2f2ba54791b3,network=Network(d7ca12ab-7c69-415e-be0d-20e16fbabaec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509a0271-f1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.082 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.082 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.082 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.083 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.083 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'b96b09d3-10b4-5e25-ac8e-2c8512c70acb', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.084 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.084 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.086 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.086 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap509a0271-f1, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.087 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap509a0271-f1, col_values=(('qos', UUID('4edcee83-bc8f-4c7a-a56f-59946a9fb971')),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.087 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap509a0271-f1, col_values=(('external_ids', {'iface-id': '509a0271-f192-4363-a7a6-2f2ba54791b3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:64:2e', 'vm-uuid': '3e3567b4-364e-4663-82fb-6a7b8c296187'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.088 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:26 compute-0 NetworkManager[55345]: <info>  [1764052406.0893] manager: (tap509a0271-f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.089 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.091 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.092 186245 INFO os_vif [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:64:2e,bridge_name='br-int',has_traffic_filtering=True,id=509a0271-f192-4363-a7a6-2f2ba54791b3,network=Network(d7ca12ab-7c69-415e-be0d-20e16fbabaec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509a0271-f1')
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:33:26 compute-0 nova_compute[186241]: 2025-11-25 06:33:26.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:33:27 compute-0 nova_compute[186241]: 2025-11-25 06:33:27.124 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:27 compute-0 nova_compute[186241]: 2025-11-25 06:33:27.617 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:33:27 compute-0 nova_compute[186241]: 2025-11-25 06:33:27.618 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12811
Nov 25 06:33:27 compute-0 nova_compute[186241]: 2025-11-25 06:33:27.618 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] No VIF found with MAC fa:16:3e:7c:64:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12787
Nov 25 06:33:27 compute-0 nova_compute[186241]: 2025-11-25 06:33:27.618 186245 INFO nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Using config drive
Nov 25 06:33:28 compute-0 nova_compute[186241]: 2025-11-25 06:33:28.876 186245 INFO nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Creating config drive at /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk.config
Nov 25 06:33:28 compute-0 nova_compute[186241]: 2025-11-25 06:33:28.880 186245 DEBUG oslo_concurrency.processutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmp3uzshvtn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:33:28 compute-0 nova_compute[186241]: 2025-11-25 06:33:28.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:33:28 compute-0 nova_compute[186241]: 2025-11-25 06:33:28.997 186245 DEBUG oslo_concurrency.processutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 31.1.0-0.20250428102727.3e7017e.el9 -quiet -J -r -V config-2 /tmp/tmp3uzshvtn" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:33:29 compute-0 kernel: tap509a0271-f1: entered promiscuous mode
Nov 25 06:33:29 compute-0 NetworkManager[55345]: <info>  [1764052409.0337] manager: (tap509a0271-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.036 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:29 compute-0 ovn_controller[95135]: 2025-11-25T06:33:29Z|00169|binding|INFO|Claiming lport 509a0271-f192-4363-a7a6-2f2ba54791b3 for this chassis.
Nov 25 06:33:29 compute-0 ovn_controller[95135]: 2025-11-25T06:33:29Z|00170|binding|INFO|509a0271-f192-4363-a7a6-2f2ba54791b3: Claiming fa:16:3e:7c:64:2e 10.100.0.8
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.038 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.040 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.053 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:64:2e 10.100.0.8'], port_security=['fa:16:3e:7c:64:2e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3e3567b4-364e-4663-82fb-6a7b8c296187', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ca12ab-7c69-415e-be0d-20e16fbabaec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '35b3e942-777b-4d1e-ab72-2cd5a20ca0aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71299229-251b-49dd-8c1a-540be467fcd4, chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=509a0271-f192-4363-a7a6-2f2ba54791b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.054 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 509a0271-f192-4363-a7a6-2f2ba54791b3 in datapath d7ca12ab-7c69-415e-be0d-20e16fbabaec bound to our chassis
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.055 103953 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7ca12ab-7c69-415e-be0d-20e16fbabaec
Nov 25 06:33:29 compute-0 systemd-udevd[218219]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.063 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd2fbdd-1c15-42e6-a032-f75eb3218f84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.065 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7ca12ab-71 in ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:777
Nov 25 06:33:29 compute-0 NetworkManager[55345]: <info>  [1764052409.0704] device (tap509a0271-f1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 06:33:29 compute-0 NetworkManager[55345]: <info>  [1764052409.0712] device (tap509a0271-f1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.067 211354 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7ca12ab-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.067 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[779fa175-e713-4a3d-9f4d-794252c92863]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.072 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[201f847b-19b3-484f-ab1e-6e443c706cb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.083 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[39fc8e79-1449-4fb2-84fd-dc112f9e04e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 systemd-machined[152921]: New machine qemu-13-instance-0000000d.
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.097 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3b8738-26f8-44cb-85c4-8d261251360f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.100 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:29 compute-0 ovn_controller[95135]: 2025-11-25T06:33:29Z|00171|binding|INFO|Setting lport 509a0271-f192-4363-a7a6-2f2ba54791b3 ovn-installed in OVS
Nov 25 06:33:29 compute-0 ovn_controller[95135]: 2025-11-25T06:33:29Z|00172|binding|INFO|Setting lport 509a0271-f192-4363-a7a6-2f2ba54791b3 up in Southbound
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.102 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.120 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[f276e697-630d-4344-ab6b-964e4ff00bee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 NetworkManager[55345]: <info>  [1764052409.1268] manager: (tapd7ca12ab-70): new Veth device (/org/freedesktop/NetworkManager/Devices/92)
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.126 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[3f841967-ec26-4f24-97b6-0cebb5dd91a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.149 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[cb5c511e-0fb3-4b4e-9c78-613279e2f98d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.151 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[1b9d31a4-07a5-4a54-b4e6-4d77e6a7a944]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 NetworkManager[55345]: <info>  [1764052409.1638] device (tapd7ca12ab-70): carrier: link connected
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.166 211867 DEBUG oslo.privsep.daemon [-] privsep: reply[0667e4cd-4d21-4f17-be9f-5a58ad63b31c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.178 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[0cbe9ba1-8b2e-4e1f-9b66-75ea4fc0f491]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ca12ab-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:96:c9:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339756, 'reachable_time': 39819, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218249, 'error': None, 'target': 'ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.189 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[fadd803e-a28f-447d-9cba-c610d90c6883]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe96:c9d7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 339756, 'tstamp': 339756}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218250, 'error': None, 'target': 'ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.200 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[175b1ccd-a2d4-4749-a018-074ed27c1c70]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ca12ab-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:96:c9:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339756, 'reachable_time': 39819, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218251, 'error': None, 'target': 'ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.220 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[a398f1ad-cb71-42b9-bb9c-2e34ea1ff03b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.254 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[a1ead61b-165d-4996-af71-8f478b9d2c0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.254 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ca12ab-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.255 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.255 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7ca12ab-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:33:29 compute-0 kernel: tapd7ca12ab-70: entered promiscuous mode
Nov 25 06:33:29 compute-0 NetworkManager[55345]: <info>  [1764052409.2573] manager: (tapd7ca12ab-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.261 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.262 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7ca12ab-70, col_values=(('external_ids', {'iface-id': '52ffae52-0232-48cd-ba01-ca45b9f24734'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:33:29 compute-0 ovn_controller[95135]: 2025-11-25T06:33:29Z|00173|binding|INFO|Releasing lport 52ffae52-0232-48cd-ba01-ca45b9f24734 from this chassis (sb_readonly=0)
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.269 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.273 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[015f05d0-32f4-460b-b809-ae077f484715]: (4, '') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.274 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7ca12ab-7c69-415e-be0d-20e16fbabaec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7ca12ab-7c69-415e-be0d-20e16fbabaec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.274 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7ca12ab-7c69-415e-be0d-20e16fbabaec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7ca12ab-7c69-415e-be0d-20e16fbabaec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.274 103953 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d7ca12ab-7c69-415e-be0d-20e16fbabaec disable /usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.274 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7ca12ab-7c69-415e-be0d-20e16fbabaec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7ca12ab-7c69-415e-be0d-20e16fbabaec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.274 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[d5eb0c62-2eb4-4d65-a3f2-80e8b7e5056a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.275 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7ca12ab-7c69-415e-be0d-20e16fbabaec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7ca12ab-7c69-415e-be0d-20e16fbabaec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.275 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.276 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd968b5-1726-461d-be41-ec97059df078]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.276 103953 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: global
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     log         /dev/log local0 debug
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     log-tag     haproxy-metadata-proxy-d7ca12ab-7c69-415e-be0d-20e16fbabaec
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     user        root
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     group       root
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     maxconn     1024
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     pidfile     /var/lib/neutron/external/pids/d7ca12ab-7c69-415e-be0d-20e16fbabaec.pid.haproxy
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     daemon
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: defaults
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     log global
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     mode http
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     option httplog
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     option dontlognull
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     option http-server-close
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     option forwardfor
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     retries                 3
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     timeout http-request    30s
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     timeout connect         30s
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     timeout client          32s
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     timeout server          32s
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     timeout http-keep-alive 30s
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: listen listener
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     bind 169.254.169.254:80
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:     http-request add-header X-OVN-Network-ID d7ca12ab-7c69-415e-be0d-20e16fbabaec
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 06:33:29 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:29.278 103953 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec', 'env', 'PROCESS_TAG=haproxy-d7ca12ab-7c69-415e-be0d-20e16fbabaec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7ca12ab-7c69-415e-be0d-20e16fbabaec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:85
Nov 25 06:33:29 compute-0 podman[218286]: 2025-11-25 06:33:29.565107449 +0000 UTC m=+0.029745404 container create 8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:33:29 compute-0 systemd[1]: Started libpod-conmon-8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed.scope.
Nov 25 06:33:29 compute-0 systemd[1]: Started libcrun container.
Nov 25 06:33:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6822d1d8c06b21ea07a0112f39e8788145fea23f5bcca54f067b6941af2d35be/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 06:33:29 compute-0 podman[218286]: 2025-11-25 06:33:29.618245008 +0000 UTC m=+0.082882963 container init 8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 06:33:29 compute-0 podman[218286]: 2025-11-25 06:33:29.623423191 +0000 UTC m=+0.088061146 container start 8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:33:29 compute-0 podman[218286]: 2025-11-25 06:33:29.551490362 +0000 UTC m=+0.016128337 image pull 302e67947a4a54ae34d532738d707d3c1f32a3afdf13de9ce2901d0a2ebbc92e quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78
Nov 25 06:33:29 compute-0 neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec[218298]: [NOTICE]   (218302) : New worker (218304) forked
Nov 25 06:33:29 compute-0 neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec[218298]: [NOTICE]   (218302) : Loading success.
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.669 186245 DEBUG nova.compute.manager [req-c9c11985-164f-44b3-bb41-294b60476b09 req-6f78ff41-8716-47bc-8c70-a8a3dac5159a a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Received event network-vif-plugged-509a0271-f192-4363-a7a6-2f2ba54791b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.669 186245 DEBUG oslo_concurrency.lockutils [req-c9c11985-164f-44b3-bb41-294b60476b09 req-6f78ff41-8716-47bc-8c70-a8a3dac5159a a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.670 186245 DEBUG oslo_concurrency.lockutils [req-c9c11985-164f-44b3-bb41-294b60476b09 req-6f78ff41-8716-47bc-8c70-a8a3dac5159a a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.670 186245 DEBUG oslo_concurrency.lockutils [req-c9c11985-164f-44b3-bb41-294b60476b09 req-6f78ff41-8716-47bc-8c70-a8a3dac5159a a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.670 186245 DEBUG nova.compute.manager [req-c9c11985-164f-44b3-bb41-294b60476b09 req-6f78ff41-8716-47bc-8c70-a8a3dac5159a a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Processing event network-vif-plugged-509a0271-f192-4363-a7a6-2f2ba54791b3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11497
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.671 186245 DEBUG nova.compute.manager [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:578
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.674 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4870
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.676 186245 INFO nova.virt.libvirt.driver [-] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Instance spawned successfully.
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.676 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1005
Nov 25 06:33:29 compute-0 nova_compute[186241]: 2025-11-25 06:33:29.928 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:33:30 compute-0 nova_compute[186241]: 2025-11-25 06:33:30.183 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:33:30 compute-0 nova_compute[186241]: 2025-11-25 06:33:30.184 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:33:30 compute-0 nova_compute[186241]: 2025-11-25 06:33:30.184 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:33:30 compute-0 nova_compute[186241]: 2025-11-25 06:33:30.185 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:33:30 compute-0 nova_compute[186241]: 2025-11-25 06:33:30.185 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:33:30 compute-0 nova_compute[186241]: 2025-11-25 06:33:30.185 186245 DEBUG nova.virt.libvirt.driver [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1034
Nov 25 06:33:30 compute-0 nova_compute[186241]: 2025-11-25 06:33:30.691 186245 INFO nova.compute.manager [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Took 13.51 seconds to spawn the instance on the hypervisor.
Nov 25 06:33:30 compute-0 nova_compute[186241]: 2025-11-25 06:33:30.691 186245 DEBUG nova.compute.manager [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1797
Nov 25 06:33:31 compute-0 podman[218309]: 2025-11-25 06:33:31.069042426 +0000 UTC m=+0.046287989 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true)
Nov 25 06:33:31 compute-0 nova_compute[186241]: 2025-11-25 06:33:31.088 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:31 compute-0 nova_compute[186241]: 2025-11-25 06:33:31.203 186245 INFO nova.compute.manager [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Took 18.63 seconds to build instance.
Nov 25 06:33:31 compute-0 nova_compute[186241]: 2025-11-25 06:33:31.706 186245 DEBUG oslo_concurrency.lockutils [None req-e34cce45-4990-43ed-8e6e-46a0a6f459ab 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:31 compute-0 nova_compute[186241]: 2025-11-25 06:33:31.868 186245 DEBUG nova.compute.manager [req-fdbf9dd9-8d17-4b94-91d6-1f5878baedf0 req-43cadfd6-efea-4e8b-9194-aef004df9d8c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Received event network-vif-plugged-509a0271-f192-4363-a7a6-2f2ba54791b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:33:31 compute-0 nova_compute[186241]: 2025-11-25 06:33:31.869 186245 DEBUG oslo_concurrency.lockutils [req-fdbf9dd9-8d17-4b94-91d6-1f5878baedf0 req-43cadfd6-efea-4e8b-9194-aef004df9d8c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:31 compute-0 nova_compute[186241]: 2025-11-25 06:33:31.869 186245 DEBUG oslo_concurrency.lockutils [req-fdbf9dd9-8d17-4b94-91d6-1f5878baedf0 req-43cadfd6-efea-4e8b-9194-aef004df9d8c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:31 compute-0 nova_compute[186241]: 2025-11-25 06:33:31.869 186245 DEBUG oslo_concurrency.lockutils [req-fdbf9dd9-8d17-4b94-91d6-1f5878baedf0 req-43cadfd6-efea-4e8b-9194-aef004df9d8c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:31 compute-0 nova_compute[186241]: 2025-11-25 06:33:31.869 186245 DEBUG nova.compute.manager [req-fdbf9dd9-8d17-4b94-91d6-1f5878baedf0 req-43cadfd6-efea-4e8b-9194-aef004df9d8c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] No waiting events found dispatching network-vif-plugged-509a0271-f192-4363-a7a6-2f2ba54791b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:33:31 compute-0 nova_compute[186241]: 2025-11-25 06:33:31.870 186245 WARNING nova.compute.manager [req-fdbf9dd9-8d17-4b94-91d6-1f5878baedf0 req-43cadfd6-efea-4e8b-9194-aef004df9d8c a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Received unexpected event network-vif-plugged-509a0271-f192-4363-a7a6-2f2ba54791b3 for instance with vm_state active and task_state None.
Nov 25 06:33:31 compute-0 nova_compute[186241]: 2025-11-25 06:33:31.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:33:32 compute-0 nova_compute[186241]: 2025-11-25 06:33:32.126 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:32 compute-0 nova_compute[186241]: 2025-11-25 06:33:32.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:33:32 compute-0 nova_compute[186241]: 2025-11-25 06:33:32.932 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:33:32 compute-0 nova_compute[186241]: 2025-11-25 06:33:32.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:33:33 compute-0 podman[218327]: 2025-11-25 06:33:33.06695984 +0000 UTC m=+0.043194354 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:33:33 compute-0 nova_compute[186241]: 2025-11-25 06:33:33.440 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:33 compute-0 nova_compute[186241]: 2025-11-25 06:33:33.441 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:33 compute-0 nova_compute[186241]: 2025-11-25 06:33:33.441 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:33 compute-0 nova_compute[186241]: 2025-11-25 06:33:33.442 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:33:34 compute-0 nova_compute[186241]: 2025-11-25 06:33:34.466 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:33:34 compute-0 nova_compute[186241]: 2025-11-25 06:33:34.519 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:33:34 compute-0 nova_compute[186241]: 2025-11-25 06:33:34.520 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:33:34 compute-0 nova_compute[186241]: 2025-11-25 06:33:34.573 186245 DEBUG oslo_concurrency.processutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:33:34 compute-0 nova_compute[186241]: 2025-11-25 06:33:34.766 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:33:34 compute-0 nova_compute[186241]: 2025-11-25 06:33:34.767 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5630MB free_disk=73.01697158813477GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:33:34 compute-0 nova_compute[186241]: 2025-11-25 06:33:34.768 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:34 compute-0 nova_compute[186241]: 2025-11-25 06:33:34.768 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:35.356 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:33:35 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:35.357 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:33:35 compute-0 nova_compute[186241]: 2025-11-25 06:33:35.356 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:35 compute-0 nova_compute[186241]: 2025-11-25 06:33:35.803 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Instance 3e3567b4-364e-4663-82fb-6a7b8c296187 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1737
Nov 25 06:33:35 compute-0 nova_compute[186241]: 2025-11-25 06:33:35.804 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:33:35 compute-0 nova_compute[186241]: 2025-11-25 06:33:35.804 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:33:35 compute-0 nova_compute[186241]: 2025-11-25 06:33:35.832 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:33:36 compute-0 nova_compute[186241]: 2025-11-25 06:33:36.089 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:36 compute-0 nova_compute[186241]: 2025-11-25 06:33:36.335 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:33:36 compute-0 nova_compute[186241]: 2025-11-25 06:33:36.840 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:33:36 compute-0 nova_compute[186241]: 2025-11-25 06:33:36.840 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:37 compute-0 nova_compute[186241]: 2025-11-25 06:33:37.127 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:37 compute-0 ovn_controller[95135]: 2025-11-25T06:33:37Z|00174|binding|INFO|Releasing lport 52ffae52-0232-48cd-ba01-ca45b9f24734 from this chassis (sb_readonly=0)
Nov 25 06:33:37 compute-0 NetworkManager[55345]: <info>  [1764052417.5120] manager: (patch-br-int-to-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Nov 25 06:33:37 compute-0 NetworkManager[55345]: <info>  [1764052417.5126] manager: (patch-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Nov 25 06:33:37 compute-0 nova_compute[186241]: 2025-11-25 06:33:37.521 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:37 compute-0 ovn_controller[95135]: 2025-11-25T06:33:37Z|00175|binding|INFO|Releasing lport 52ffae52-0232-48cd-ba01-ca45b9f24734 from this chassis (sb_readonly=0)
Nov 25 06:33:37 compute-0 nova_compute[186241]: 2025-11-25 06:33:37.545 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:37 compute-0 nova_compute[186241]: 2025-11-25 06:33:37.549 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:37 compute-0 nova_compute[186241]: 2025-11-25 06:33:37.836 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:33:37 compute-0 nova_compute[186241]: 2025-11-25 06:33:37.836 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:33:38 compute-0 nova_compute[186241]: 2025-11-25 06:33:38.356 186245 DEBUG nova.compute.manager [req-dc1a990a-d53a-4629-8dbf-e756c0249c35 req-588648f4-93fe-4ec2-a04e-32b08ae2e041 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Received event network-changed-509a0271-f192-4363-a7a6-2f2ba54791b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:33:38 compute-0 nova_compute[186241]: 2025-11-25 06:33:38.356 186245 DEBUG nova.compute.manager [req-dc1a990a-d53a-4629-8dbf-e756c0249c35 req-588648f4-93fe-4ec2-a04e-32b08ae2e041 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Refreshing instance network info cache due to event network-changed-509a0271-f192-4363-a7a6-2f2ba54791b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:33:38 compute-0 nova_compute[186241]: 2025-11-25 06:33:38.357 186245 DEBUG oslo_concurrency.lockutils [req-dc1a990a-d53a-4629-8dbf-e756c0249c35 req-588648f4-93fe-4ec2-a04e-32b08ae2e041 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-3e3567b4-364e-4663-82fb-6a7b8c296187" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:33:38 compute-0 nova_compute[186241]: 2025-11-25 06:33:38.357 186245 DEBUG oslo_concurrency.lockutils [req-dc1a990a-d53a-4629-8dbf-e756c0249c35 req-588648f4-93fe-4ec2-a04e-32b08ae2e041 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-3e3567b4-364e-4663-82fb-6a7b8c296187" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:33:38 compute-0 nova_compute[186241]: 2025-11-25 06:33:38.357 186245 DEBUG nova.network.neutron [req-dc1a990a-d53a-4629-8dbf-e756c0249c35 req-588648f4-93fe-4ec2-a04e-32b08ae2e041 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Refreshing network info cache for port 509a0271-f192-4363-a7a6-2f2ba54791b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:33:39 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:39.358 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:33:40 compute-0 ovn_controller[95135]: 2025-11-25T06:33:40Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:64:2e 10.100.0.8
Nov 25 06:33:40 compute-0 ovn_controller[95135]: 2025-11-25T06:33:40Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:64:2e 10.100.0.8
Nov 25 06:33:41 compute-0 nova_compute[186241]: 2025-11-25 06:33:41.091 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:41 compute-0 nova_compute[186241]: 2025-11-25 06:33:41.164 186245 DEBUG nova.network.neutron [req-dc1a990a-d53a-4629-8dbf-e756c0249c35 req-588648f4-93fe-4ec2-a04e-32b08ae2e041 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Updated VIF entry in instance network info cache for port 509a0271-f192-4363-a7a6-2f2ba54791b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:33:41 compute-0 nova_compute[186241]: 2025-11-25 06:33:41.164 186245 DEBUG nova.network.neutron [req-dc1a990a-d53a-4629-8dbf-e756c0249c35 req-588648f4-93fe-4ec2-a04e-32b08ae2e041 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Updating instance_info_cache with network_info: [{"id": "509a0271-f192-4363-a7a6-2f2ba54791b3", "address": "fa:16:3e:7c:64:2e", "network": {"id": "d7ca12ab-7c69-415e-be0d-20e16fbabaec", "bridge": "br-int", "label": "tempest-network-smoke--852150635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509a0271-f1", "ovs_interfaceid": "509a0271-f192-4363-a7a6-2f2ba54791b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:33:41 compute-0 nova_compute[186241]: 2025-11-25 06:33:41.667 186245 DEBUG oslo_concurrency.lockutils [req-dc1a990a-d53a-4629-8dbf-e756c0249c35 req-588648f4-93fe-4ec2-a04e-32b08ae2e041 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-3e3567b4-364e-4663-82fb-6a7b8c296187" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:33:42 compute-0 nova_compute[186241]: 2025-11-25 06:33:42.127 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:43 compute-0 podman[218366]: 2025-11-25 06:33:43.079439401 +0000 UTC m=+0.059035919 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 25 06:33:46 compute-0 nova_compute[186241]: 2025-11-25 06:33:46.094 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:46 compute-0 nova_compute[186241]: 2025-11-25 06:33:46.631 186245 INFO nova.compute.manager [None req-b1722bcd-17e4-4fbb-9dc7-ca9b2847b9cb 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Get console output
Nov 25 06:33:46 compute-0 nova_compute[186241]: 2025-11-25 06:33:46.634 211770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 25 06:33:47 compute-0 podman[218390]: 2025-11-25 06:33:47.056977619 +0000 UTC m=+0.034034692 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:33:47 compute-0 podman[218389]: 2025-11-25 06:33:47.069989545 +0000 UTC m=+0.048838673 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2)
Nov 25 06:33:47 compute-0 nova_compute[186241]: 2025-11-25 06:33:47.129 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:47.813 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:47.813 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:47.814 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:48 compute-0 ovn_controller[95135]: 2025-11-25T06:33:48Z|00176|binding|INFO|Releasing lport 52ffae52-0232-48cd-ba01-ca45b9f24734 from this chassis (sb_readonly=0)
Nov 25 06:33:48 compute-0 nova_compute[186241]: 2025-11-25 06:33:48.258 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:48 compute-0 ovn_controller[95135]: 2025-11-25T06:33:48Z|00177|binding|INFO|Releasing lport 52ffae52-0232-48cd-ba01-ca45b9f24734 from this chassis (sb_readonly=0)
Nov 25 06:33:48 compute-0 nova_compute[186241]: 2025-11-25 06:33:48.312 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:49 compute-0 nova_compute[186241]: 2025-11-25 06:33:49.537 186245 INFO nova.compute.manager [None req-507fce69-966f-481d-aefc-064ef82b4444 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Get console output
Nov 25 06:33:49 compute-0 nova_compute[186241]: 2025-11-25 06:33:49.540 211770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 25 06:33:50 compute-0 NetworkManager[55345]: <info>  [1764052430.5335] manager: (patch-br-int-to-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Nov 25 06:33:50 compute-0 nova_compute[186241]: 2025-11-25 06:33:50.533 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:50 compute-0 NetworkManager[55345]: <info>  [1764052430.5341] manager: (patch-provnet-697b4bb5-2c45-4ca4-98b7-51a4d59a5582-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Nov 25 06:33:50 compute-0 ovn_controller[95135]: 2025-11-25T06:33:50Z|00178|binding|INFO|Releasing lport 52ffae52-0232-48cd-ba01-ca45b9f24734 from this chassis (sb_readonly=0)
Nov 25 06:33:50 compute-0 nova_compute[186241]: 2025-11-25 06:33:50.585 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:50 compute-0 nova_compute[186241]: 2025-11-25 06:33:50.590 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:51 compute-0 nova_compute[186241]: 2025-11-25 06:33:51.003 186245 INFO nova.compute.manager [None req-c66c5a37-a436-4758-9dfd-1355a200d640 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Get console output
Nov 25 06:33:51 compute-0 nova_compute[186241]: 2025-11-25 06:33:51.006 211770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 25 06:33:51 compute-0 podman[218431]: 2025-11-25 06:33:51.05992956 +0000 UTC m=+0.037029942 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:33:51 compute-0 nova_compute[186241]: 2025-11-25 06:33:51.095 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:52 compute-0 nova_compute[186241]: 2025-11-25 06:33:52.130 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:53 compute-0 nova_compute[186241]: 2025-11-25 06:33:53.590 186245 DEBUG nova.compute.manager [req-6466bc55-aa7c-4682-9907-63c55829c14c req-307f28f4-6c87-4e90-a6ac-de7f16026744 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Received event network-changed-509a0271-f192-4363-a7a6-2f2ba54791b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:33:53 compute-0 nova_compute[186241]: 2025-11-25 06:33:53.590 186245 DEBUG nova.compute.manager [req-6466bc55-aa7c-4682-9907-63c55829c14c req-307f28f4-6c87-4e90-a6ac-de7f16026744 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Refreshing instance network info cache due to event network-changed-509a0271-f192-4363-a7a6-2f2ba54791b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11742
Nov 25 06:33:53 compute-0 nova_compute[186241]: 2025-11-25 06:33:53.590 186245 DEBUG oslo_concurrency.lockutils [req-6466bc55-aa7c-4682-9907-63c55829c14c req-307f28f4-6c87-4e90-a6ac-de7f16026744 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "refresh_cache-3e3567b4-364e-4663-82fb-6a7b8c296187" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 06:33:53 compute-0 nova_compute[186241]: 2025-11-25 06:33:53.591 186245 DEBUG oslo_concurrency.lockutils [req-6466bc55-aa7c-4682-9907-63c55829c14c req-307f28f4-6c87-4e90-a6ac-de7f16026744 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquired lock "refresh_cache-3e3567b4-364e-4663-82fb-6a7b8c296187" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 06:33:53 compute-0 nova_compute[186241]: 2025-11-25 06:33:53.591 186245 DEBUG nova.network.neutron [req-6466bc55-aa7c-4682-9907-63c55829c14c req-307f28f4-6c87-4e90-a6ac-de7f16026744 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Refreshing network info cache for port 509a0271-f192-4363-a7a6-2f2ba54791b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2064
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.080 186245 DEBUG oslo_concurrency.lockutils [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "3e3567b4-364e-4663-82fb-6a7b8c296187" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.081 186245 DEBUG oslo_concurrency.lockutils [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.081 186245 DEBUG oslo_concurrency.lockutils [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.081 186245 DEBUG oslo_concurrency.lockutils [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.082 186245 DEBUG oslo_concurrency.lockutils [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.082 186245 INFO nova.compute.manager [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Terminating instance
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.587 186245 DEBUG nova.compute.manager [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3164
Nov 25 06:33:54 compute-0 kernel: tap509a0271-f1 (unregistering): left promiscuous mode
Nov 25 06:33:54 compute-0 NetworkManager[55345]: <info>  [1764052434.6094] device (tap509a0271-f1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.616 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:54 compute-0 ovn_controller[95135]: 2025-11-25T06:33:54Z|00179|binding|INFO|Releasing lport 509a0271-f192-4363-a7a6-2f2ba54791b3 from this chassis (sb_readonly=0)
Nov 25 06:33:54 compute-0 ovn_controller[95135]: 2025-11-25T06:33:54Z|00180|binding|INFO|Setting lport 509a0271-f192-4363-a7a6-2f2ba54791b3 down in Southbound
Nov 25 06:33:54 compute-0 ovn_controller[95135]: 2025-11-25T06:33:54Z|00181|binding|INFO|Removing iface tap509a0271-f1 ovn-installed in OVS
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.623 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:64:2e 10.100.0.8'], port_security=['fa:16:3e:7c:64:2e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3e3567b4-364e-4663-82fb-6a7b8c296187', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ca12ab-7c69-415e-be0d-20e16fbabaec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd90b557db9104ecfb816b1cdab8712bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '35b3e942-777b-4d1e-ab72-2cd5a20ca0aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71299229-251b-49dd-8c1a-540be467fcd4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>], logical_port=509a0271-f192-4363-a7a6-2f2ba54791b3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6d693130d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.620 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.627 103953 INFO neutron.agent.ovn.metadata.agent [-] Port 509a0271-f192-4363-a7a6-2f2ba54791b3 in datapath d7ca12ab-7c69-415e-be0d-20e16fbabaec unbound from our chassis
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.628 103953 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7ca12ab-7c69-415e-be0d-20e16fbabaec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:739
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.628 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[4319cb3e-2546-4935-b624-f219b61e9941]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.629 103953 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec namespace which is not needed anymore
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.633 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:54 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 25 06:33:54 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 10.298s CPU time.
Nov 25 06:33:54 compute-0 systemd-machined[152921]: Machine qemu-13-instance-0000000d terminated.
Nov 25 06:33:54 compute-0 podman[218447]: 2025-11-25 06:33:54.677820068 +0000 UTC m=+0.053726439 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Nov 25 06:33:54 compute-0 neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec[218298]: [NOTICE]   (218302) : haproxy version is 2.8.14-c23fe91
Nov 25 06:33:54 compute-0 neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec[218298]: [NOTICE]   (218302) : path to executable is /usr/sbin/haproxy
Nov 25 06:33:54 compute-0 neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec[218298]: [WARNING]  (218302) : Exiting Master process...
Nov 25 06:33:54 compute-0 podman[218486]: 2025-11-25 06:33:54.717474193 +0000 UTC m=+0.020312959 container kill 8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125)
Nov 25 06:33:54 compute-0 neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec[218298]: [ALERT]    (218302) : Current worker (218304) exited with code 143 (Terminated)
Nov 25 06:33:54 compute-0 neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec[218298]: [WARNING]  (218302) : All workers exited. Exiting... (0)
Nov 25 06:33:54 compute-0 systemd[1]: libpod-8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed.scope: Deactivated successfully.
Nov 25 06:33:54 compute-0 podman[218498]: 2025-11-25 06:33:54.749249417 +0000 UTC m=+0.017404830 container died 8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true)
Nov 25 06:33:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed-userdata-shm.mount: Deactivated successfully.
Nov 25 06:33:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-6822d1d8c06b21ea07a0112f39e8788145fea23f5bcca54f067b6941af2d35be-merged.mount: Deactivated successfully.
Nov 25 06:33:54 compute-0 podman[218498]: 2025-11-25 06:33:54.771441815 +0000 UTC m=+0.039597218 container cleanup 8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true)
Nov 25 06:33:54 compute-0 systemd[1]: libpod-conmon-8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed.scope: Deactivated successfully.
Nov 25 06:33:54 compute-0 podman[218500]: 2025-11-25 06:33:54.778910001 +0000 UTC m=+0.042230288 container remove 8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.782 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5fce6bc4-a2ff-4968-bee5-c8a25c08e202]: (4, ("Tue Nov 25 06:33:54 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec (8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed)\n8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed\nTue Nov 25 06:33:54 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec (8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed)\n8d0285e5ddeb22e71ae98e7fde665607bad1a7e0c83c60ca95588ec98ccdc1ed\n", '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.783 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[77a45af8-6423-4218-93c6-25554ff4a331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.783 103953 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7ca12ab-7c69-415e-be0d-20e16fbabaec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7ca12ab-7c69-415e-be0d-20e16fbabaec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:269
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.784 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d6f28b-dfa9-4a09-87f7-a16e3d5ccb58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.784 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ca12ab-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.785 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.798 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:54 compute-0 kernel: tapd7ca12ab-70: left promiscuous mode
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.803 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.806 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[6306a1bb-df7b-4f7d-8976-3aa1b25c9749]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.816 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[5f81e4d0-8616-4d8c-9fcb-f200ffc5b569]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.817 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[dc56ddad-8798-43d5-b574-33fb3a8edbce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.826 186245 INFO nova.virt.libvirt.driver [-] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Instance destroyed successfully.
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.827 186245 DEBUG nova.objects.instance [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lazy-loading 'resources' on Instance uuid 3e3567b4-364e-4663-82fb-6a7b8c296187 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1141
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.828 211354 DEBUG oslo.privsep.daemon [-] privsep: reply[7a37ba57-dee4-4f8c-8561-2b4c70c5bfa8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339752, 'reachable_time': 34392, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218538, 'error': None, 'target': 'ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:54 compute-0 systemd[1]: run-netns-ovnmeta\x2dd7ca12ab\x2d7c69\x2d415e\x2dbe0d\x2d20e16fbabaec.mount: Deactivated successfully.
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.831 104066 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7ca12ab-7c69-415e-be0d-20e16fbabaec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 06:33:54 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:33:54.831 104066 DEBUG oslo.privsep.daemon [-] privsep: reply[0f8a14cc-a9e2-4e72-badc-1878408a82a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:503
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.853 186245 DEBUG nova.compute.manager [req-b42854d9-2614-4bed-8f22-ae8e54e362de req-ed1afe8c-dcf2-488a-a51b-54acc88a4213 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Received event network-vif-unplugged-509a0271-f192-4363-a7a6-2f2ba54791b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.853 186245 DEBUG oslo_concurrency.lockutils [req-b42854d9-2614-4bed-8f22-ae8e54e362de req-ed1afe8c-dcf2-488a-a51b-54acc88a4213 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.853 186245 DEBUG oslo_concurrency.lockutils [req-b42854d9-2614-4bed-8f22-ae8e54e362de req-ed1afe8c-dcf2-488a-a51b-54acc88a4213 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.853 186245 DEBUG oslo_concurrency.lockutils [req-b42854d9-2614-4bed-8f22-ae8e54e362de req-ed1afe8c-dcf2-488a-a51b-54acc88a4213 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.854 186245 DEBUG nova.compute.manager [req-b42854d9-2614-4bed-8f22-ae8e54e362de req-ed1afe8c-dcf2-488a-a51b-54acc88a4213 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] No waiting events found dispatching network-vif-unplugged-509a0271-f192-4363-a7a6-2f2ba54791b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:33:54 compute-0 nova_compute[186241]: 2025-11-25 06:33:54.854 186245 DEBUG nova.compute.manager [req-b42854d9-2614-4bed-8f22-ae8e54e362de req-ed1afe8c-dcf2-488a-a51b-54acc88a4213 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Received event network-vif-unplugged-509a0271-f192-4363-a7a6-2f2ba54791b3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11515
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.331 186245 DEBUG nova.virt.libvirt.vif [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T06:33:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1555126729',display_name='tempest-TestNetworkBasicOps-server-1555126729',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1555126729',id=13,image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMamTGNu+pgqfT14H5GiPNJTaBKE2C6EeXRosqXkYEtnp6ufmDyaZ6AGyCPn1Jitqb+VSd4DDMgvHLatkEVlQ0PiyP0uXnqra1nP9RkOaF8WOIzDqsi58jateJwjcAxfUA==',key_name='tempest-TestNetworkBasicOps-183630991',keypairs=<?>,launch_index=0,launched_at=2025-11-25T06:33:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d90b557db9104ecfb816b1cdab8712bd',ramdisk_id='',reservation_id='r-uprckrac',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5215c26e-be2f-40b4-ac47-476bfa3cf3f2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1672753768',owner_user_name='tempest-TestNetworkBasicOps-1672753768-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T06:33:30Z,user_data=None,user_id='66a05d0ca82146a5a458244c8e5364de',uuid=3e3567b4-364e-4663-82fb-6a7b8c296187,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "509a0271-f192-4363-a7a6-2f2ba54791b3", "address": "fa:16:3e:7c:64:2e", "network": {"id": "d7ca12ab-7c69-415e-be0d-20e16fbabaec", "bridge": "br-int", "label": "tempest-network-smoke--852150635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509a0271-f1", "ovs_interfaceid": "509a0271-f192-4363-a7a6-2f2ba54791b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.331 186245 DEBUG nova.network.os_vif_util [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converting VIF {"id": "509a0271-f192-4363-a7a6-2f2ba54791b3", "address": "fa:16:3e:7c:64:2e", "network": {"id": "d7ca12ab-7c69-415e-be0d-20e16fbabaec", "bridge": "br-int", "label": "tempest-network-smoke--852150635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509a0271-f1", "ovs_interfaceid": "509a0271-f192-4363-a7a6-2f2ba54791b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.332 186245 DEBUG nova.network.os_vif_util [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:64:2e,bridge_name='br-int',has_traffic_filtering=True,id=509a0271-f192-4363-a7a6-2f2ba54791b3,network=Network(d7ca12ab-7c69-415e-be0d-20e16fbabaec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509a0271-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.332 186245 DEBUG os_vif [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:64:2e,bridge_name='br-int',has_traffic_filtering=True,id=509a0271-f192-4363-a7a6-2f2ba54791b3,network=Network(d7ca12ab-7c69-415e-be0d-20e16fbabaec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509a0271-f1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.333 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.333 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap509a0271-f1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.335 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.337 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.338 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.338 186245 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4edcee83-bc8f-4c7a-a56f-59946a9fb971) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.338 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.340 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.342 186245 INFO os_vif [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:64:2e,bridge_name='br-int',has_traffic_filtering=True,id=509a0271-f192-4363-a7a6-2f2ba54791b3,network=Network(d7ca12ab-7c69-415e-be0d-20e16fbabaec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap509a0271-f1')
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.342 186245 INFO nova.virt.libvirt.driver [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Deleting instance files /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187_del
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.343 186245 INFO nova.virt.libvirt.driver [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Deletion of /var/lib/nova/instances/3e3567b4-364e-4663-82fb-6a7b8c296187_del complete
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.851 186245 INFO nova.compute.manager [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Took 1.26 seconds to destroy the instance on the hypervisor.
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.851 186245 DEBUG oslo.service.backend.eventlet.loopingcall [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/backend/eventlet/loopingcall.py:436
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.852 186245 DEBUG nova.compute.manager [-] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2296
Nov 25 06:33:55 compute-0 nova_compute[186241]: 2025-11-25 06:33:55.852 186245 DEBUG nova.network.neutron [-] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1860
Nov 25 06:33:56 compute-0 nova_compute[186241]: 2025-11-25 06:33:56.685 186245 DEBUG nova.network.neutron [req-6466bc55-aa7c-4682-9907-63c55829c14c req-307f28f4-6c87-4e90-a6ac-de7f16026744 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Updated VIF entry in instance network info cache for port 509a0271-f192-4363-a7a6-2f2ba54791b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3539
Nov 25 06:33:56 compute-0 nova_compute[186241]: 2025-11-25 06:33:56.685 186245 DEBUG nova.network.neutron [req-6466bc55-aa7c-4682-9907-63c55829c14c req-307f28f4-6c87-4e90-a6ac-de7f16026744 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Updating instance_info_cache with network_info: [{"id": "509a0271-f192-4363-a7a6-2f2ba54791b3", "address": "fa:16:3e:7c:64:2e", "network": {"id": "d7ca12ab-7c69-415e-be0d-20e16fbabaec", "bridge": "br-int", "label": "tempest-network-smoke--852150635", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d90b557db9104ecfb816b1cdab8712bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap509a0271-f1", "ovs_interfaceid": "509a0271-f192-4363-a7a6-2f2ba54791b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.101 186245 DEBUG nova.compute.manager [req-7f926ecd-37cb-457e-8596-f506a657c533 req-f4a485d7-040e-4603-9a69-44e4e7af5762 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Received event network-vif-deleted-509a0271-f192-4363-a7a6-2f2ba54791b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.101 186245 INFO nova.compute.manager [req-7f926ecd-37cb-457e-8596-f506a657c533 req-f4a485d7-040e-4603-9a69-44e4e7af5762 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Neutron deleted interface 509a0271-f192-4363-a7a6-2f2ba54791b3; detaching it from the instance and deleting it from the info cache
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.102 186245 DEBUG nova.network.neutron [req-7f926ecd-37cb-457e-8596-f506a657c533 req-f4a485d7-040e-4603-9a69-44e4e7af5762 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.133 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.141 186245 DEBUG nova.compute.manager [req-5e0af146-366c-44f1-b44b-85971f4e6708 req-f2779210-71de-43e6-a5b5-8e881185526e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Received event network-vif-plugged-509a0271-f192-4363-a7a6-2f2ba54791b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11737
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.142 186245 DEBUG oslo_concurrency.lockutils [req-5e0af146-366c-44f1-b44b-85971f4e6708 req-f2779210-71de-43e6-a5b5-8e881185526e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Acquiring lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.142 186245 DEBUG oslo_concurrency.lockutils [req-5e0af146-366c-44f1-b44b-85971f4e6708 req-f2779210-71de-43e6-a5b5-8e881185526e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.142 186245 DEBUG oslo_concurrency.lockutils [req-5e0af146-366c-44f1-b44b-85971f4e6708 req-f2779210-71de-43e6-a5b5-8e881185526e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.142 186245 DEBUG nova.compute.manager [req-5e0af146-366c-44f1-b44b-85971f4e6708 req-f2779210-71de-43e6-a5b5-8e881185526e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] No waiting events found dispatching network-vif-plugged-509a0271-f192-4363-a7a6-2f2ba54791b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:321
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.142 186245 WARNING nova.compute.manager [req-5e0af146-366c-44f1-b44b-85971f4e6708 req-f2779210-71de-43e6-a5b5-8e881185526e a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Received unexpected event network-vif-plugged-509a0271-f192-4363-a7a6-2f2ba54791b3 for instance with vm_state active and task_state deleting.
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.188 186245 DEBUG oslo_concurrency.lockutils [req-6466bc55-aa7c-4682-9907-63c55829c14c req-307f28f4-6c87-4e90-a6ac-de7f16026744 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] Releasing lock "refresh_cache-3e3567b4-364e-4663-82fb-6a7b8c296187" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.351 186245 DEBUG nova.network.neutron [-] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.606 186245 DEBUG nova.compute.manager [req-7f926ecd-37cb-457e-8596-f506a657c533 req-f4a485d7-040e-4603-9a69-44e4e7af5762 a5b9a1c6b01f48609e4bba682f605454 56a5355592004ce8816503a3e613e2e8 - - default default] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Detach interface failed, port_id=509a0271-f192-4363-a7a6-2f2ba54791b3, reason: Instance 3e3567b4-364e-4663-82fb-6a7b8c296187 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11571
Nov 25 06:33:57 compute-0 nova_compute[186241]: 2025-11-25 06:33:57.856 186245 INFO nova.compute.manager [-] [instance: 3e3567b4-364e-4663-82fb-6a7b8c296187] Took 2.00 seconds to deallocate network for instance.
Nov 25 06:33:58 compute-0 nova_compute[186241]: 2025-11-25 06:33:58.362 186245 DEBUG oslo_concurrency.lockutils [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:33:58 compute-0 nova_compute[186241]: 2025-11-25 06:33:58.363 186245 DEBUG oslo_concurrency.lockutils [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:33:58 compute-0 nova_compute[186241]: 2025-11-25 06:33:58.416 186245 DEBUG nova.compute.provider_tree [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:33:58 compute-0 nova_compute[186241]: 2025-11-25 06:33:58.920 186245 DEBUG nova.scheduler.client.report [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:33:58 compute-0 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 06:33:59 compute-0 nova_compute[186241]: 2025-11-25 06:33:59.427 186245 DEBUG oslo_concurrency.lockutils [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:33:59 compute-0 nova_compute[186241]: 2025-11-25 06:33:59.454 186245 INFO nova.scheduler.client.report [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Deleted allocations for instance 3e3567b4-364e-4663-82fb-6a7b8c296187
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.551 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:33:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:33:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:34:00 compute-0 nova_compute[186241]: 2025-11-25 06:34:00.339 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:00 compute-0 nova_compute[186241]: 2025-11-25 06:34:00.462 186245 DEBUG oslo_concurrency.lockutils [None req-197ec862-1e6f-400c-a9ce-d7cba051025e 66a05d0ca82146a5a458244c8e5364de d90b557db9104ecfb816b1cdab8712bd - - default default] Lock "3e3567b4-364e-4663-82fb-6a7b8c296187" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.382s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:34:02 compute-0 podman[218543]: 2025-11-25 06:34:02.068891194 +0000 UTC m=+0.042359932 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 06:34:02 compute-0 nova_compute[186241]: 2025-11-25 06:34:02.134 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:04 compute-0 podman[218560]: 2025-11-25 06:34:04.083907713 +0000 UTC m=+0.062347636 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 25 06:34:05 compute-0 nova_compute[186241]: 2025-11-25 06:34:05.341 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:05 compute-0 nova_compute[186241]: 2025-11-25 06:34:05.445 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:05 compute-0 nova_compute[186241]: 2025-11-25 06:34:05.520 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:07 compute-0 nova_compute[186241]: 2025-11-25 06:34:07.134 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:10 compute-0 nova_compute[186241]: 2025-11-25 06:34:10.342 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:12 compute-0 nova_compute[186241]: 2025-11-25 06:34:12.135 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:14 compute-0 podman[218582]: 2025-11-25 06:34:14.072574566 +0000 UTC m=+0.052788643 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 06:34:15 compute-0 nova_compute[186241]: 2025-11-25 06:34:15.344 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:17 compute-0 nova_compute[186241]: 2025-11-25 06:34:17.136 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:18 compute-0 podman[218607]: 2025-11-25 06:34:18.060993926 +0000 UTC m=+0.039571389 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 25 06:34:18 compute-0 podman[218608]: 2025-11-25 06:34:18.085976027 +0000 UTC m=+0.062743631 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:34:20 compute-0 nova_compute[186241]: 2025-11-25 06:34:20.346 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:22 compute-0 podman[218645]: 2025-11-25 06:34:22.055057847 +0000 UTC m=+0.035474606 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 06:34:22 compute-0 nova_compute[186241]: 2025-11-25 06:34:22.139 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:25 compute-0 podman[218661]: 2025-11-25 06:34:25.090889895 +0000 UTC m=+0.065754296 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vcs-type=git, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 06:34:25 compute-0 nova_compute[186241]: 2025-11-25 06:34:25.347 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:27 compute-0 nova_compute[186241]: 2025-11-25 06:34:27.140 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:27 compute-0 nova_compute[186241]: 2025-11-25 06:34:27.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:34:27 compute-0 nova_compute[186241]: 2025-11-25 06:34:27.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:34:30 compute-0 nova_compute[186241]: 2025-11-25 06:34:30.348 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:30 compute-0 nova_compute[186241]: 2025-11-25 06:34:30.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:34:32 compute-0 nova_compute[186241]: 2025-11-25 06:34:32.141 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:33 compute-0 podman[218679]: 2025-11-25 06:34:33.059923265 +0000 UTC m=+0.039296300 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:34:33 compute-0 nova_compute[186241]: 2025-11-25 06:34:33.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:34:33 compute-0 nova_compute[186241]: 2025-11-25 06:34:33.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:34:33 compute-0 nova_compute[186241]: 2025-11-25 06:34:33.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:34:33 compute-0 nova_compute[186241]: 2025-11-25 06:34:33.932 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:34:33 compute-0 nova_compute[186241]: 2025-11-25 06:34:33.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:34:34 compute-0 nova_compute[186241]: 2025-11-25 06:34:34.447 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:34:34 compute-0 nova_compute[186241]: 2025-11-25 06:34:34.447 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:34:34 compute-0 nova_compute[186241]: 2025-11-25 06:34:34.448 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:34:34 compute-0 nova_compute[186241]: 2025-11-25 06:34:34.448 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:34:34 compute-0 podman[218697]: 2025-11-25 06:34:34.519104594 +0000 UTC m=+0.045594654 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:34:34 compute-0 nova_compute[186241]: 2025-11-25 06:34:34.645 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:34:34 compute-0 nova_compute[186241]: 2025-11-25 06:34:34.646 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5783MB free_disk=73.0179214477539GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:34:34 compute-0 nova_compute[186241]: 2025-11-25 06:34:34.646 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:34:34 compute-0 nova_compute[186241]: 2025-11-25 06:34:34.646 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:34:35 compute-0 nova_compute[186241]: 2025-11-25 06:34:35.349 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:35 compute-0 nova_compute[186241]: 2025-11-25 06:34:35.683 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:34:35 compute-0 nova_compute[186241]: 2025-11-25 06:34:35.683 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:34:35 compute-0 nova_compute[186241]: 2025-11-25 06:34:35.701 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing inventories for resource provider b9b31722-b833-4ea1-a013-247935742e36 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:822
Nov 25 06:34:35 compute-0 nova_compute[186241]: 2025-11-25 06:34:35.714 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating ProviderTree inventory for provider b9b31722-b833-4ea1-a013-247935742e36 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:786
Nov 25 06:34:35 compute-0 nova_compute[186241]: 2025-11-25 06:34:35.715 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating inventory in ProviderTree for provider b9b31722-b833-4ea1-a013-247935742e36 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 25 06:34:35 compute-0 nova_compute[186241]: 2025-11-25 06:34:35.724 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing aggregate associations for resource provider b9b31722-b833-4ea1-a013-247935742e36, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:831
Nov 25 06:34:35 compute-0 nova_compute[186241]: 2025-11-25 06:34:35.739 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing trait associations for resource provider b9b31722-b833-4ea1-a013-247935742e36, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_AVX512VPCLMULQDQ,HW_CPU_X86_AESNI,HW_CPU_X86_F16C,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,HW_ARCH_X86_64,HW_CPU_X86_AMD_SVM,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX512VAES,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX2,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:843
Nov 25 06:34:35 compute-0 nova_compute[186241]: 2025-11-25 06:34:35.756 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:34:35 compute-0 ovn_controller[95135]: 2025-11-25T06:34:35Z|00182|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Nov 25 06:34:36 compute-0 nova_compute[186241]: 2025-11-25 06:34:36.259 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:34:36 compute-0 nova_compute[186241]: 2025-11-25 06:34:36.765 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:34:36 compute-0 nova_compute[186241]: 2025-11-25 06:34:36.766 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:34:37 compute-0 nova_compute[186241]: 2025-11-25 06:34:37.142 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:37 compute-0 nova_compute[186241]: 2025-11-25 06:34:37.761 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:34:40 compute-0 nova_compute[186241]: 2025-11-25 06:34:40.350 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:42 compute-0 nova_compute[186241]: 2025-11-25 06:34:42.144 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:45 compute-0 podman[218718]: 2025-11-25 06:34:45.073918959 +0000 UTC m=+0.053326076 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 06:34:45 compute-0 nova_compute[186241]: 2025-11-25 06:34:45.351 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:47 compute-0 nova_compute[186241]: 2025-11-25 06:34:47.144 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:34:47.837 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:34:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:34:47.837 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:34:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:34:47.838 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:34:49 compute-0 podman[218743]: 2025-11-25 06:34:49.064079378 +0000 UTC m=+0.043162189 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 25 06:34:49 compute-0 podman[218744]: 2025-11-25 06:34:49.081951906 +0000 UTC m=+0.059847159 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:34:49 compute-0 sshd-session[218781]: Accepted publickey for zuul from 192.168.122.10 port 43046 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:34:49 compute-0 systemd-logind[744]: New session 26 of user zuul.
Nov 25 06:34:49 compute-0 systemd[1]: Started Session 26 of User zuul.
Nov 25 06:34:49 compute-0 sshd-session[218781]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:34:49 compute-0 sudo[218785]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 25 06:34:49 compute-0 sudo[218785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:34:50 compute-0 nova_compute[186241]: 2025-11-25 06:34:50.352 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:52 compute-0 nova_compute[186241]: 2025-11-25 06:34:52.144 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:52 compute-0 podman[218919]: 2025-11-25 06:34:52.38010052 +0000 UTC m=+0.038350980 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 06:34:53 compute-0 ovs-vsctl[218961]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 25 06:34:53 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 218809 (sos)
Nov 25 06:34:53 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 25 06:34:53 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 25 06:34:53 compute-0 virtqemud[186538]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 25 06:34:53 compute-0 virtqemud[186538]: hostname: compute-0
Nov 25 06:34:53 compute-0 virtqemud[186538]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 25 06:34:53 compute-0 virtqemud[186538]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 25 06:34:53 compute-0 virtqemud[186538]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 25 06:34:54 compute-0 crontab[219353]: (root) LIST (root)
Nov 25 06:34:55 compute-0 nova_compute[186241]: 2025-11-25 06:34:55.353 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:34:56 compute-0 podman[219441]: 2025-11-25 06:34:56.079867641 +0000 UTC m=+0.055961645 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Nov 25 06:34:56 compute-0 systemd[1]: Starting Hostname Service...
Nov 25 06:34:56 compute-0 systemd[1]: Started Hostname Service.
Nov 25 06:34:57 compute-0 nova_compute[186241]: 2025-11-25 06:34:57.146 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:00 compute-0 nova_compute[186241]: 2025-11-25 06:35:00.354 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:02 compute-0 nova_compute[186241]: 2025-11-25 06:35:02.146 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:02 compute-0 ovs-appctl[220870]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 25 06:35:02 compute-0 ovs-appctl[220882]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 25 06:35:02 compute-0 ovs-appctl[220888]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 25 06:35:04 compute-0 podman[221570]: 2025-11-25 06:35:04.071920961 +0000 UTC m=+0.046114323 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:35:04 compute-0 podman[221724]: 2025-11-25 06:35:04.874878619 +0000 UTC m=+0.072255417 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 25 06:35:05 compute-0 nova_compute[186241]: 2025-11-25 06:35:05.355 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:07 compute-0 nova_compute[186241]: 2025-11-25 06:35:07.146 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:07 compute-0 virtqemud[186538]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 25 06:35:09 compute-0 systemd[1]: Starting Time & Date Service...
Nov 25 06:35:09 compute-0 systemd[1]: Started Time & Date Service.
Nov 25 06:35:10 compute-0 nova_compute[186241]: 2025-11-25 06:35:10.356 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:12 compute-0 nova_compute[186241]: 2025-11-25 06:35:12.147 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:15 compute-0 podman[222361]: 2025-11-25 06:35:15.167325598 +0000 UTC m=+0.055745266 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 25 06:35:15 compute-0 nova_compute[186241]: 2025-11-25 06:35:15.358 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:17 compute-0 nova_compute[186241]: 2025-11-25 06:35:17.149 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:19 compute-0 podman[222385]: 2025-11-25 06:35:19.388972168 +0000 UTC m=+0.049918393 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:35:19 compute-0 podman[222384]: 2025-11-25 06:35:19.408970723 +0000 UTC m=+0.071683609 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 25 06:35:20 compute-0 nova_compute[186241]: 2025-11-25 06:35:20.359 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:22 compute-0 nova_compute[186241]: 2025-11-25 06:35:22.149 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:22 compute-0 podman[222424]: 2025-11-25 06:35:22.926909814 +0000 UTC m=+0.037844054 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 25 06:35:22 compute-0 nova_compute[186241]: 2025-11-25 06:35:22.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:35:22 compute-0 nova_compute[186241]: 2025-11-25 06:35:22.931 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11872
Nov 25 06:35:25 compute-0 nova_compute[186241]: 2025-11-25 06:35:25.360 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:25 compute-0 sudo[218785]: pam_unix(sudo:session): session closed for user root
Nov 25 06:35:25 compute-0 sshd-session[218784]: Received disconnect from 192.168.122.10 port 43046:11: disconnected by user
Nov 25 06:35:25 compute-0 sshd-session[218784]: Disconnected from user zuul 192.168.122.10 port 43046
Nov 25 06:35:25 compute-0 sshd-session[218781]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:35:25 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Nov 25 06:35:25 compute-0 systemd[1]: session-26.scope: Consumed 55.891s CPU time, 500.0M memory peak, read 91.7M from disk, written 36.9M to disk.
Nov 25 06:35:25 compute-0 systemd-logind[744]: Session 26 logged out. Waiting for processes to exit.
Nov 25 06:35:25 compute-0 systemd-logind[744]: Removed session 26.
Nov 25 06:35:25 compute-0 sshd-session[222440]: Accepted publickey for zuul from 192.168.122.10 port 42678 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:35:25 compute-0 systemd-logind[744]: New session 27 of user zuul.
Nov 25 06:35:25 compute-0 systemd[1]: Started Session 27 of User zuul.
Nov 25 06:35:25 compute-0 sshd-session[222440]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:35:26 compute-0 sudo[222444]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-11-25-czoeayd.tar.xz
Nov 25 06:35:26 compute-0 sudo[222444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:35:26 compute-0 sudo[222444]: pam_unix(sudo:session): session closed for user root
Nov 25 06:35:26 compute-0 sshd-session[222443]: Received disconnect from 192.168.122.10 port 42678:11: disconnected by user
Nov 25 06:35:26 compute-0 sshd-session[222443]: Disconnected from user zuul 192.168.122.10 port 42678
Nov 25 06:35:26 compute-0 sshd-session[222440]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:35:26 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Nov 25 06:35:26 compute-0 systemd-logind[744]: Session 27 logged out. Waiting for processes to exit.
Nov 25 06:35:26 compute-0 systemd-logind[744]: Removed session 27.
Nov 25 06:35:26 compute-0 sshd-session[222470]: Accepted publickey for zuul from 192.168.122.10 port 42690 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:35:26 compute-0 podman[222469]: 2025-11-25 06:35:26.174297678 +0000 UTC m=+0.068229749 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 25 06:35:26 compute-0 systemd-logind[744]: New session 28 of user zuul.
Nov 25 06:35:26 compute-0 systemd[1]: Started Session 28 of User zuul.
Nov 25 06:35:26 compute-0 sshd-session[222470]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:35:26 compute-0 sudo[222492]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Nov 25 06:35:26 compute-0 sudo[222492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:35:26 compute-0 sudo[222492]: pam_unix(sudo:session): session closed for user root
Nov 25 06:35:26 compute-0 sshd-session[222491]: Received disconnect from 192.168.122.10 port 42690:11: disconnected by user
Nov 25 06:35:26 compute-0 sshd-session[222491]: Disconnected from user zuul 192.168.122.10 port 42690
Nov 25 06:35:26 compute-0 sshd-session[222470]: pam_unix(sshd:session): session closed for user zuul
Nov 25 06:35:26 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Nov 25 06:35:26 compute-0 systemd-logind[744]: Session 28 logged out. Waiting for processes to exit.
Nov 25 06:35:26 compute-0 systemd-logind[744]: Removed session 28.
Nov 25 06:35:27 compute-0 nova_compute[186241]: 2025-11-25 06:35:27.152 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:30 compute-0 nova_compute[186241]: 2025-11-25 06:35:30.361 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:30 compute-0 nova_compute[186241]: 2025-11-25 06:35:30.430 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:35:30 compute-0 nova_compute[186241]: 2025-11-25 06:35:30.935 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:35:30 compute-0 nova_compute[186241]: 2025-11-25 06:35:30.935 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:35:32 compute-0 nova_compute[186241]: 2025-11-25 06:35:32.154 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:32 compute-0 nova_compute[186241]: 2025-11-25 06:35:32.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:35:33 compute-0 nova_compute[186241]: 2025-11-25 06:35:33.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:35:34 compute-0 nova_compute[186241]: 2025-11-25 06:35:34.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:35:34 compute-0 nova_compute[186241]: 2025-11-25 06:35:34.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:35:35 compute-0 podman[222517]: 2025-11-25 06:35:35.063954436 +0000 UTC m=+0.039841449 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:35:35 compute-0 podman[222518]: 2025-11-25 06:35:35.065262191 +0000 UTC m=+0.040015967 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 06:35:35 compute-0 nova_compute[186241]: 2025-11-25 06:35:35.362 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:35 compute-0 nova_compute[186241]: 2025-11-25 06:35:35.444 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:35:35 compute-0 nova_compute[186241]: 2025-11-25 06:35:35.445 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:35:35 compute-0 nova_compute[186241]: 2025-11-25 06:35:35.445 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:35:35 compute-0 nova_compute[186241]: 2025-11-25 06:35:35.445 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:35:35 compute-0 nova_compute[186241]: 2025-11-25 06:35:35.622 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:35:35 compute-0 nova_compute[186241]: 2025-11-25 06:35:35.622 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5739MB free_disk=73.0172348022461GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:35:35 compute-0 nova_compute[186241]: 2025-11-25 06:35:35.622 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:35:35 compute-0 nova_compute[186241]: 2025-11-25 06:35:35.623 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:35:36 compute-0 nova_compute[186241]: 2025-11-25 06:35:36.663 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:35:36 compute-0 nova_compute[186241]: 2025-11-25 06:35:36.663 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:35:36 compute-0 nova_compute[186241]: 2025-11-25 06:35:36.681 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:35:37 compute-0 nova_compute[186241]: 2025-11-25 06:35:37.156 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:37 compute-0 nova_compute[186241]: 2025-11-25 06:35:37.185 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:35:37 compute-0 nova_compute[186241]: 2025-11-25 06:35:37.186 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:35:37 compute-0 nova_compute[186241]: 2025-11-25 06:35:37.186 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:35:37 compute-0 nova_compute[186241]: 2025-11-25 06:35:37.186 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:35:37 compute-0 nova_compute[186241]: 2025-11-25 06:35:37.187 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11834
Nov 25 06:35:37 compute-0 nova_compute[186241]: 2025-11-25 06:35:37.690 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11843
Nov 25 06:35:38 compute-0 nova_compute[186241]: 2025-11-25 06:35:38.689 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:35:38 compute-0 nova_compute[186241]: 2025-11-25 06:35:38.690 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:35:38 compute-0 nova_compute[186241]: 2025-11-25 06:35:38.690 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:35:39 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 06:35:39 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 25 06:35:40 compute-0 nova_compute[186241]: 2025-11-25 06:35:40.363 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:41 compute-0 nova_compute[186241]: 2025-11-25 06:35:41.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:35:42 compute-0 nova_compute[186241]: 2025-11-25 06:35:42.158 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:45 compute-0 nova_compute[186241]: 2025-11-25 06:35:45.364 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:46 compute-0 podman[222559]: 2025-11-25 06:35:46.075785725 +0000 UTC m=+0.055226108 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 06:35:47 compute-0 nova_compute[186241]: 2025-11-25 06:35:47.159 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:35:47.888 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:35:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:35:47.889 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:35:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:35:47.889 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:35:50 compute-0 podman[222584]: 2025-11-25 06:35:50.056932493 +0000 UTC m=+0.034593016 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:35:50 compute-0 podman[222583]: 2025-11-25 06:35:50.06079833 +0000 UTC m=+0.039910781 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 06:35:50 compute-0 nova_compute[186241]: 2025-11-25 06:35:50.365 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:52 compute-0 nova_compute[186241]: 2025-11-25 06:35:52.159 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:53 compute-0 podman[222621]: 2025-11-25 06:35:53.054519863 +0000 UTC m=+0.033183359 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 06:35:55 compute-0 nova_compute[186241]: 2025-11-25 06:35:55.366 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:57 compute-0 podman[222637]: 2025-11-25 06:35:57.057776357 +0000 UTC m=+0.036825013 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, version=9.6, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc.)
Nov 25 06:35:57 compute-0 nova_compute[186241]: 2025-11-25 06:35:57.161 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.551 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:35:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:35:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:36:00 compute-0 nova_compute[186241]: 2025-11-25 06:36:00.368 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:02 compute-0 nova_compute[186241]: 2025-11-25 06:36:02.162 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:05 compute-0 nova_compute[186241]: 2025-11-25 06:36:05.369 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:06 compute-0 podman[222656]: 2025-11-25 06:36:06.067895808 +0000 UTC m=+0.043150025 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:36:06 compute-0 podman[222655]: 2025-11-25 06:36:06.085248966 +0000 UTC m=+0.063438105 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 25 06:36:07 compute-0 nova_compute[186241]: 2025-11-25 06:36:07.164 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:10 compute-0 nova_compute[186241]: 2025-11-25 06:36:10.370 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:12 compute-0 nova_compute[186241]: 2025-11-25 06:36:12.165 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:15 compute-0 nova_compute[186241]: 2025-11-25 06:36:15.371 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:17 compute-0 podman[222694]: 2025-11-25 06:36:17.073036204 +0000 UTC m=+0.053096454 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 25 06:36:17 compute-0 nova_compute[186241]: 2025-11-25 06:36:17.166 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:20 compute-0 nova_compute[186241]: 2025-11-25 06:36:20.372 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:21 compute-0 podman[222718]: 2025-11-25 06:36:21.058568007 +0000 UTC m=+0.035550059 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:36:21 compute-0 podman[222717]: 2025-11-25 06:36:21.06110014 +0000 UTC m=+0.038896858 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true)
Nov 25 06:36:22 compute-0 nova_compute[186241]: 2025-11-25 06:36:22.167 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:22 compute-0 nova_compute[186241]: 2025-11-25 06:36:22.229 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:36:24 compute-0 podman[222755]: 2025-11-25 06:36:24.048344305 +0000 UTC m=+0.027588422 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 06:36:25 compute-0 nova_compute[186241]: 2025-11-25 06:36:25.373 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:27 compute-0 nova_compute[186241]: 2025-11-25 06:36:27.168 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:28 compute-0 podman[222771]: 2025-11-25 06:36:28.05603677 +0000 UTC m=+0.036409500 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1755695350, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 
'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Nov 25 06:36:30 compute-0 nova_compute[186241]: 2025-11-25 06:36:30.375 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:30 compute-0 nova_compute[186241]: 2025-11-25 06:36:30.436 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:36:30 compute-0 nova_compute[186241]: 2025-11-25 06:36:30.436 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:36:32 compute-0 nova_compute[186241]: 2025-11-25 06:36:32.169 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:33 compute-0 nova_compute[186241]: 2025-11-25 06:36:33.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:36:33 compute-0 nova_compute[186241]: 2025-11-25 06:36:33.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:36:34 compute-0 nova_compute[186241]: 2025-11-25 06:36:34.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:36:35 compute-0 nova_compute[186241]: 2025-11-25 06:36:35.377 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:35 compute-0 nova_compute[186241]: 2025-11-25 06:36:35.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:36:36 compute-0 nova_compute[186241]: 2025-11-25 06:36:36.480 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:36:36 compute-0 nova_compute[186241]: 2025-11-25 06:36:36.480 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:36:36 compute-0 nova_compute[186241]: 2025-11-25 06:36:36.480 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:36:36 compute-0 nova_compute[186241]: 2025-11-25 06:36:36.480 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:36:36 compute-0 nova_compute[186241]: 2025-11-25 06:36:36.644 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:36:36 compute-0 nova_compute[186241]: 2025-11-25 06:36:36.645 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5782MB free_disk=73.01746368408203GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:36:36 compute-0 nova_compute[186241]: 2025-11-25 06:36:36.645 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:36:36 compute-0 nova_compute[186241]: 2025-11-25 06:36:36.646 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:36:37 compute-0 podman[222791]: 2025-11-25 06:36:37.060909671 +0000 UTC m=+0.037658844 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:36:37 compute-0 podman[222790]: 2025-11-25 06:36:37.082075326 +0000 UTC m=+0.060767562 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:36:37 compute-0 nova_compute[186241]: 2025-11-25 06:36:37.171 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:37 compute-0 nova_compute[186241]: 2025-11-25 06:36:37.913 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:36:37 compute-0 nova_compute[186241]: 2025-11-25 06:36:37.914 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:36:38 compute-0 nova_compute[186241]: 2025-11-25 06:36:38.184 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:36:38 compute-0 nova_compute[186241]: 2025-11-25 06:36:38.688 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:36:38 compute-0 nova_compute[186241]: 2025-11-25 06:36:38.689 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:36:38 compute-0 nova_compute[186241]: 2025-11-25 06:36:38.689 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:36:40 compute-0 nova_compute[186241]: 2025-11-25 06:36:40.378 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:40 compute-0 nova_compute[186241]: 2025-11-25 06:36:40.686 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:36:40 compute-0 nova_compute[186241]: 2025-11-25 06:36:40.687 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:36:40 compute-0 nova_compute[186241]: 2025-11-25 06:36:40.687 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:36:42 compute-0 nova_compute[186241]: 2025-11-25 06:36:42.172 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:45 compute-0 nova_compute[186241]: 2025-11-25 06:36:45.379 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:47 compute-0 nova_compute[186241]: 2025-11-25 06:36:47.174 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:36:47.949 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:36:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:36:47.950 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:36:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:36:47.950 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:36:48 compute-0 podman[222830]: 2025-11-25 06:36:48.070480429 +0000 UTC m=+0.050713471 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:36:50 compute-0 nova_compute[186241]: 2025-11-25 06:36:50.380 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:52 compute-0 podman[222854]: 2025-11-25 06:36:52.061605181 +0000 UTC m=+0.038347907 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:36:52 compute-0 podman[222853]: 2025-11-25 06:36:52.061720519 +0000 UTC m=+0.040447145 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd)
Nov 25 06:36:52 compute-0 nova_compute[186241]: 2025-11-25 06:36:52.175 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:55 compute-0 podman[222891]: 2025-11-25 06:36:55.055919456 +0000 UTC m=+0.035664626 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 25 06:36:55 compute-0 nova_compute[186241]: 2025-11-25 06:36:55.381 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:57 compute-0 nova_compute[186241]: 2025-11-25 06:36:57.176 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:36:59 compute-0 podman[222907]: 2025-11-25 06:36:59.060062048 +0000 UTC m=+0.039940440 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter)
Nov 25 06:37:00 compute-0 nova_compute[186241]: 2025-11-25 06:37:00.382 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:02 compute-0 nova_compute[186241]: 2025-11-25 06:37:02.177 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:05 compute-0 nova_compute[186241]: 2025-11-25 06:37:05.384 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:07 compute-0 nova_compute[186241]: 2025-11-25 06:37:07.178 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:08 compute-0 podman[222926]: 2025-11-25 06:37:08.064923705 +0000 UTC m=+0.041407075 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 25 06:37:08 compute-0 podman[222925]: 2025-11-25 06:37:08.085937717 +0000 UTC m=+0.063768167 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:37:10 compute-0 nova_compute[186241]: 2025-11-25 06:37:10.385 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:12 compute-0 nova_compute[186241]: 2025-11-25 06:37:12.180 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:15 compute-0 nova_compute[186241]: 2025-11-25 06:37:15.386 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:17 compute-0 nova_compute[186241]: 2025-11-25 06:37:17.181 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:19 compute-0 podman[222965]: 2025-11-25 06:37:19.079101393 +0000 UTC m=+0.058288634 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:37:20 compute-0 nova_compute[186241]: 2025-11-25 06:37:20.388 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:22 compute-0 nova_compute[186241]: 2025-11-25 06:37:22.182 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:23 compute-0 podman[222988]: 2025-11-25 06:37:23.064320131 +0000 UTC m=+0.039270015 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:37:23 compute-0 podman[222989]: 2025-11-25 06:37:23.066877673 +0000 UTC m=+0.039592994 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:37:25 compute-0 nova_compute[186241]: 2025-11-25 06:37:25.389 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:26 compute-0 podman[223028]: 2025-11-25 06:37:26.05697589 +0000 UTC m=+0.036874028 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 25 06:37:27 compute-0 nova_compute[186241]: 2025-11-25 06:37:27.182 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:30 compute-0 podman[223046]: 2025-11-25 06:37:30.059025825 +0000 UTC m=+0.039372949 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64)
Nov 25 06:37:30 compute-0 nova_compute[186241]: 2025-11-25 06:37:30.390 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:30 compute-0 nova_compute[186241]: 2025-11-25 06:37:30.928 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:37:31 compute-0 nova_compute[186241]: 2025-11-25 06:37:31.434 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:37:31 compute-0 nova_compute[186241]: 2025-11-25 06:37:31.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:37:32 compute-0 nova_compute[186241]: 2025-11-25 06:37:32.184 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:33 compute-0 nova_compute[186241]: 2025-11-25 06:37:33.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:37:35 compute-0 nova_compute[186241]: 2025-11-25 06:37:35.391 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:35 compute-0 nova_compute[186241]: 2025-11-25 06:37:35.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:37:35 compute-0 nova_compute[186241]: 2025-11-25 06:37:35.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:37:36 compute-0 nova_compute[186241]: 2025-11-25 06:37:36.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:37:37 compute-0 nova_compute[186241]: 2025-11-25 06:37:37.186 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:37 compute-0 nova_compute[186241]: 2025-11-25 06:37:37.445 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:37:37 compute-0 nova_compute[186241]: 2025-11-25 06:37:37.446 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:37:37 compute-0 nova_compute[186241]: 2025-11-25 06:37:37.446 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:37:37 compute-0 nova_compute[186241]: 2025-11-25 06:37:37.446 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:37:37 compute-0 nova_compute[186241]: 2025-11-25 06:37:37.621 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:37:37 compute-0 nova_compute[186241]: 2025-11-25 06:37:37.621 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5777MB free_disk=73.01773452758789GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:37:37 compute-0 nova_compute[186241]: 2025-11-25 06:37:37.622 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:37:37 compute-0 nova_compute[186241]: 2025-11-25 06:37:37.622 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:37:38 compute-0 nova_compute[186241]: 2025-11-25 06:37:38.937 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:37:38 compute-0 nova_compute[186241]: 2025-11-25 06:37:38.938 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:37:38 compute-0 nova_compute[186241]: 2025-11-25 06:37:38.956 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:37:39 compute-0 podman[223065]: 2025-11-25 06:37:39.067934725 +0000 UTC m=+0.044598572 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:37:39 compute-0 podman[223064]: 2025-11-25 06:37:39.087997414 +0000 UTC m=+0.067310465 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:37:39 compute-0 nova_compute[186241]: 2025-11-25 06:37:39.469 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:37:39 compute-0 nova_compute[186241]: 2025-11-25 06:37:39.470 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:37:39 compute-0 nova_compute[186241]: 2025-11-25 06:37:39.470 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.849s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:37:40 compute-0 nova_compute[186241]: 2025-11-25 06:37:40.393 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:41 compute-0 nova_compute[186241]: 2025-11-25 06:37:41.470 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:37:41 compute-0 nova_compute[186241]: 2025-11-25 06:37:41.470 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:37:41 compute-0 nova_compute[186241]: 2025-11-25 06:37:41.471 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:37:42 compute-0 nova_compute[186241]: 2025-11-25 06:37:42.185 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:44 compute-0 nova_compute[186241]: 2025-11-25 06:37:44.454 186245 DEBUG oslo_concurrency.processutils [None req-25fb0b84-c8a9-4c08-ab75-0f59ad806e84 4e66869ee9ea42dcbe6b3cd42ccd5ef5 569b0ed2b3cc4372897b86d284219992 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:349
Nov 25 06:37:44 compute-0 nova_compute[186241]: 2025-11-25 06:37:44.466 186245 DEBUG oslo_concurrency.processutils [None req-25fb0b84-c8a9-4c08-ab75-0f59ad806e84 4e66869ee9ea42dcbe6b3cd42ccd5ef5 569b0ed2b3cc4372897b86d284219992 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:372
Nov 25 06:37:45 compute-0 nova_compute[186241]: 2025-11-25 06:37:45.394 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:47 compute-0 nova_compute[186241]: 2025-11-25 06:37:47.187 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:37:47.960 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:37:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:37:47.961 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:37:47 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:37:47.961 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:37:49 compute-0 nova_compute[186241]: 2025-11-25 06:37:49.719 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:49 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:37:49.719 103953 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '36:64:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:4f:2e:0d:4b:88'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 06:37:49 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:37:49.720 103953 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 06:37:50 compute-0 podman[223106]: 2025-11-25 06:37:50.07055527 +0000 UTC m=+0.047451521 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 25 06:37:50 compute-0 nova_compute[186241]: 2025-11-25 06:37:50.396 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:50 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:37:50.721 103953 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=afd6e104-36fa-47e5-ae59-019941e8d117, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 06:37:52 compute-0 nova_compute[186241]: 2025-11-25 06:37:52.188 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:54 compute-0 podman[223130]: 2025-11-25 06:37:54.057943309 +0000 UTC m=+0.035760008 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:37:54 compute-0 podman[223129]: 2025-11-25 06:37:54.061937761 +0000 UTC m=+0.041510940 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_id=multipathd)
Nov 25 06:37:55 compute-0 nova_compute[186241]: 2025-11-25 06:37:55.397 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:57 compute-0 podman[223167]: 2025-11-25 06:37:57.05586179 +0000 UTC m=+0.035580147 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:37:57 compute-0 nova_compute[186241]: 2025-11-25 06:37:57.190 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.552 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:37:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:37:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:38:00 compute-0 nova_compute[186241]: 2025-11-25 06:38:00.399 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:01 compute-0 podman[223183]: 2025-11-25 06:38:01.057928657 +0000 UTC m=+0.038770533 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, distribution-scope=public)
Nov 25 06:38:02 compute-0 nova_compute[186241]: 2025-11-25 06:38:02.190 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:05 compute-0 nova_compute[186241]: 2025-11-25 06:38:05.401 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:07 compute-0 nova_compute[186241]: 2025-11-25 06:38:07.192 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:10 compute-0 podman[223202]: 2025-11-25 06:38:10.060904819 +0000 UTC m=+0.038756396 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:38:10 compute-0 podman[223203]: 2025-11-25 06:38:10.064914429 +0000 UTC m=+0.041281478 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true)
Nov 25 06:38:10 compute-0 nova_compute[186241]: 2025-11-25 06:38:10.402 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:12 compute-0 nova_compute[186241]: 2025-11-25 06:38:12.194 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:15 compute-0 nova_compute[186241]: 2025-11-25 06:38:15.403 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:17 compute-0 nova_compute[186241]: 2025-11-25 06:38:17.196 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:20 compute-0 nova_compute[186241]: 2025-11-25 06:38:20.404 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:21 compute-0 podman[223241]: 2025-11-25 06:38:21.072320558 +0000 UTC m=+0.052740804 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 06:38:22 compute-0 nova_compute[186241]: 2025-11-25 06:38:22.198 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:25 compute-0 podman[223265]: 2025-11-25 06:38:25.055638333 +0000 UTC m=+0.032018793 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:38:25 compute-0 podman[223264]: 2025-11-25 06:38:25.063005775 +0000 UTC m=+0.041148587 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 25 06:38:25 compute-0 nova_compute[186241]: 2025-11-25 06:38:25.406 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:27 compute-0 nova_compute[186241]: 2025-11-25 06:38:27.201 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:28 compute-0 podman[223303]: 2025-11-25 06:38:28.05599913 +0000 UTC m=+0.035465693 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 06:38:30 compute-0 nova_compute[186241]: 2025-11-25 06:38:30.407 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:32 compute-0 podman[223319]: 2025-11-25 06:38:32.061877983 +0000 UTC m=+0.038157407 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 06:38:32 compute-0 nova_compute[186241]: 2025-11-25 06:38:32.202 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:32 compute-0 nova_compute[186241]: 2025-11-25 06:38:32.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:38:33 compute-0 nova_compute[186241]: 2025-11-25 06:38:33.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:38:35 compute-0 nova_compute[186241]: 2025-11-25 06:38:35.408 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:35 compute-0 nova_compute[186241]: 2025-11-25 06:38:35.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:38:36 compute-0 nova_compute[186241]: 2025-11-25 06:38:36.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:38:36 compute-0 nova_compute[186241]: 2025-11-25 06:38:36.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:38:37 compute-0 nova_compute[186241]: 2025-11-25 06:38:37.205 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:38 compute-0 nova_compute[186241]: 2025-11-25 06:38:38.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:38:39 compute-0 nova_compute[186241]: 2025-11-25 06:38:39.445 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:38:39 compute-0 nova_compute[186241]: 2025-11-25 06:38:39.446 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:38:39 compute-0 nova_compute[186241]: 2025-11-25 06:38:39.446 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:38:39 compute-0 nova_compute[186241]: 2025-11-25 06:38:39.446 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:38:39 compute-0 nova_compute[186241]: 2025-11-25 06:38:39.617 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:38:39 compute-0 nova_compute[186241]: 2025-11-25 06:38:39.618 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5776MB free_disk=73.01773452758789GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:38:39 compute-0 nova_compute[186241]: 2025-11-25 06:38:39.618 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:38:39 compute-0 nova_compute[186241]: 2025-11-25 06:38:39.618 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:38:40 compute-0 nova_compute[186241]: 2025-11-25 06:38:40.409 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:40 compute-0 nova_compute[186241]: 2025-11-25 06:38:40.654 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:38:40 compute-0 nova_compute[186241]: 2025-11-25 06:38:40.655 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:38:40 compute-0 nova_compute[186241]: 2025-11-25 06:38:40.673 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:38:41 compute-0 podman[223337]: 2025-11-25 06:38:41.057911978 +0000 UTC m=+0.037114603 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:38:41 compute-0 podman[223338]: 2025-11-25 06:38:41.06794795 +0000 UTC m=+0.044686809 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 25 06:38:41 compute-0 nova_compute[186241]: 2025-11-25 06:38:41.176 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:38:41 compute-0 nova_compute[186241]: 2025-11-25 06:38:41.177 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:38:41 compute-0 nova_compute[186241]: 2025-11-25 06:38:41.177 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:38:42 compute-0 nova_compute[186241]: 2025-11-25 06:38:42.177 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:38:42 compute-0 nova_compute[186241]: 2025-11-25 06:38:42.178 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:38:42 compute-0 nova_compute[186241]: 2025-11-25 06:38:42.178 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:38:42 compute-0 nova_compute[186241]: 2025-11-25 06:38:42.206 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:45 compute-0 nova_compute[186241]: 2025-11-25 06:38:45.411 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:47 compute-0 nova_compute[186241]: 2025-11-25 06:38:47.208 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:38:48.011 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:38:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:38:48.011 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:38:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:38:48.011 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:38:50 compute-0 nova_compute[186241]: 2025-11-25 06:38:50.413 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:52 compute-0 podman[223377]: 2025-11-25 06:38:52.072991975 +0000 UTC m=+0.052184445 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 25 06:38:52 compute-0 nova_compute[186241]: 2025-11-25 06:38:52.209 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:55 compute-0 nova_compute[186241]: 2025-11-25 06:38:55.415 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:56 compute-0 podman[223401]: 2025-11-25 06:38:56.064165863 +0000 UTC m=+0.039628241 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=multipathd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Nov 25 06:38:56 compute-0 podman[223402]: 2025-11-25 06:38:56.064819135 +0000 UTC m=+0.039081220 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:38:57 compute-0 nova_compute[186241]: 2025-11-25 06:38:57.211 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:38:59 compute-0 podman[223441]: 2025-11-25 06:38:59.076914039 +0000 UTC m=+0.057594932 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 25 06:39:00 compute-0 nova_compute[186241]: 2025-11-25 06:39:00.416 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:02 compute-0 nova_compute[186241]: 2025-11-25 06:39:02.212 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:03 compute-0 podman[223457]: 2025-11-25 06:39:03.057874395 +0000 UTC m=+0.037266532 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9)
Nov 25 06:39:05 compute-0 nova_compute[186241]: 2025-11-25 06:39:05.417 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:07 compute-0 nova_compute[186241]: 2025-11-25 06:39:07.214 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:10 compute-0 nova_compute[186241]: 2025-11-25 06:39:10.419 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:12 compute-0 podman[223476]: 2025-11-25 06:39:12.061818397 +0000 UTC m=+0.039111378 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:39:12 compute-0 podman[223477]: 2025-11-25 06:39:12.064883492 +0000 UTC m=+0.040748231 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, 
tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3)
Nov 25 06:39:12 compute-0 nova_compute[186241]: 2025-11-25 06:39:12.215 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:15 compute-0 nova_compute[186241]: 2025-11-25 06:39:15.420 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:17 compute-0 nova_compute[186241]: 2025-11-25 06:39:17.217 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:20 compute-0 nova_compute[186241]: 2025-11-25 06:39:20.422 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:22 compute-0 nova_compute[186241]: 2025-11-25 06:39:22.218 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:23 compute-0 podman[223514]: 2025-11-25 06:39:23.067933733 +0000 UTC m=+0.048446325 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true)
Nov 25 06:39:25 compute-0 nova_compute[186241]: 2025-11-25 06:39:25.423 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:27 compute-0 podman[223537]: 2025-11-25 06:39:27.060091696 +0000 UTC m=+0.038954703 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:39:27 compute-0 podman[223538]: 2025-11-25 06:39:27.073123189 +0000 UTC m=+0.049471937 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 06:39:27 compute-0 nova_compute[186241]: 2025-11-25 06:39:27.219 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:30 compute-0 podman[223575]: 2025-11-25 06:39:30.055928024 +0000 UTC m=+0.035083841 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 25 06:39:30 compute-0 nova_compute[186241]: 2025-11-25 06:39:30.424 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:30 compute-0 nova_compute[186241]: 2025-11-25 06:39:30.928 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:39:32 compute-0 nova_compute[186241]: 2025-11-25 06:39:32.220 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:33 compute-0 nova_compute[186241]: 2025-11-25 06:39:33.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:39:34 compute-0 podman[223591]: 2025-11-25 06:39:34.058058013 +0000 UTC m=+0.038337349 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 06:39:34 compute-0 nova_compute[186241]: 2025-11-25 06:39:34.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:39:35 compute-0 nova_compute[186241]: 2025-11-25 06:39:35.425 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:35 compute-0 nova_compute[186241]: 2025-11-25 06:39:35.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:39:37 compute-0 nova_compute[186241]: 2025-11-25 06:39:37.221 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:38 compute-0 nova_compute[186241]: 2025-11-25 06:39:38.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:39:38 compute-0 nova_compute[186241]: 2025-11-25 06:39:38.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:39:39 compute-0 nova_compute[186241]: 2025-11-25 06:39:39.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:39:40 compute-0 nova_compute[186241]: 2025-11-25 06:39:40.426 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:40 compute-0 nova_compute[186241]: 2025-11-25 06:39:40.445 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:39:40 compute-0 nova_compute[186241]: 2025-11-25 06:39:40.445 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:39:40 compute-0 nova_compute[186241]: 2025-11-25 06:39:40.445 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:39:40 compute-0 nova_compute[186241]: 2025-11-25 06:39:40.445 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:39:40 compute-0 nova_compute[186241]: 2025-11-25 06:39:40.622 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:39:40 compute-0 nova_compute[186241]: 2025-11-25 06:39:40.622 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5779MB free_disk=73.01773452758789GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:39:40 compute-0 nova_compute[186241]: 2025-11-25 06:39:40.623 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:39:40 compute-0 nova_compute[186241]: 2025-11-25 06:39:40.623 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:39:41 compute-0 nova_compute[186241]: 2025-11-25 06:39:41.651 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:39:41 compute-0 nova_compute[186241]: 2025-11-25 06:39:41.651 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:39:41 compute-0 nova_compute[186241]: 2025-11-25 06:39:41.667 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing inventories for resource provider b9b31722-b833-4ea1-a013-247935742e36 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:822
Nov 25 06:39:41 compute-0 nova_compute[186241]: 2025-11-25 06:39:41.681 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating ProviderTree inventory for provider b9b31722-b833-4ea1-a013-247935742e36 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:786
Nov 25 06:39:41 compute-0 nova_compute[186241]: 2025-11-25 06:39:41.681 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Updating inventory in ProviderTree for provider b9b31722-b833-4ea1-a013-247935742e36 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 25 06:39:41 compute-0 nova_compute[186241]: 2025-11-25 06:39:41.689 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing aggregate associations for resource provider b9b31722-b833-4ea1-a013-247935742e36, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:831
Nov 25 06:39:41 compute-0 nova_compute[186241]: 2025-11-25 06:39:41.703 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Refreshing trait associations for resource provider b9b31722-b833-4ea1-a013-247935742e36, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_AVX512VPCLMULQDQ,HW_CPU_X86_AESNI,HW_CPU_X86_F16C,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,HW_ARCH_X86_64,HW_CPU_X86_AMD_SVM,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX512VAES,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX2,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SCSI 
_refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:843
Nov 25 06:39:41 compute-0 nova_compute[186241]: 2025-11-25 06:39:41.717 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:39:42 compute-0 nova_compute[186241]: 2025-11-25 06:39:42.221 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:39:42 compute-0 nova_compute[186241]: 2025-11-25 06:39:42.222 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:39:42 compute-0 nova_compute[186241]: 2025-11-25 06:39:42.222 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:39:42 compute-0 nova_compute[186241]: 2025-11-25 06:39:42.223 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:43 compute-0 podman[223609]: 2025-11-25 06:39:43.060306101 +0000 UTC m=+0.038704091 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:39:43 compute-0 podman[223610]: 2025-11-25 06:39:43.064425511 +0000 UTC m=+0.041339666 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true)
Nov 25 06:39:44 compute-0 nova_compute[186241]: 2025-11-25 06:39:44.222 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:39:44 compute-0 nova_compute[186241]: 2025-11-25 06:39:44.222 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:39:44 compute-0 nova_compute[186241]: 2025-11-25 06:39:44.222 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:39:45 compute-0 nova_compute[186241]: 2025-11-25 06:39:45.427 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:47 compute-0 nova_compute[186241]: 2025-11-25 06:39:47.223 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:39:48.070 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:39:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:39:48.070 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:39:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:39:48.070 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:39:50 compute-0 nova_compute[186241]: 2025-11-25 06:39:50.428 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:52 compute-0 nova_compute[186241]: 2025-11-25 06:39:52.224 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:54 compute-0 podman[223648]: 2025-11-25 06:39:54.076923239 +0000 UTC m=+0.055756939 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 25 06:39:55 compute-0 nova_compute[186241]: 2025-11-25 06:39:55.429 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:57 compute-0 nova_compute[186241]: 2025-11-25 06:39:57.226 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:39:58 compute-0 podman[223671]: 2025-11-25 06:39:58.058927501 +0000 UTC m=+0.038024991 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 25 06:39:58 compute-0 podman[223672]: 2025-11-25 06:39:58.061139259 +0000 UTC m=+0.037526250 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.552 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7ff3800b2fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7ff3800c4a60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7ff3800b2c10>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7ff3800c4850>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7ff3800c4be0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.553 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7ff3800afd30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.553 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7ff3800c4250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7ff3800c41f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7ff3800ca550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7ff3800b2490>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7ff385f9eb80>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7ff3800c4880>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7ff3800b2610>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.554 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7ff3800c42e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7ff3800c4130>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7ff3800b2550>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7ff3800b2f70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7ff3800c4670>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7ff3800c4b20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7ff3800c4400>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7ff3800c45b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.555 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7ff3800b2040>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7ff3800b2280>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7ff3800b2250>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7ff3800c4fa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.556 16 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7ff3800b2b50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7ff381106670>>]. poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:212
Nov 25 06:39:59 compute-0 ceilometer_agent_compute[196902]: 2025-11-25 06:39:59.556 16 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:236
Nov 25 06:40:00 compute-0 nova_compute[186241]: 2025-11-25 06:40:00.430 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:01 compute-0 podman[223709]: 2025-11-25 06:40:01.058858744 +0000 UTC m=+0.034250390 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true)
Nov 25 06:40:02 compute-0 nova_compute[186241]: 2025-11-25 06:40:02.227 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:05 compute-0 podman[223725]: 2025-11-25 06:40:05.057865198 +0000 UTC m=+0.037917438 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 06:40:05 compute-0 nova_compute[186241]: 2025-11-25 06:40:05.432 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:07 compute-0 nova_compute[186241]: 2025-11-25 06:40:07.228 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:10 compute-0 nova_compute[186241]: 2025-11-25 06:40:10.433 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:12 compute-0 nova_compute[186241]: 2025-11-25 06:40:12.229 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:14 compute-0 podman[223744]: 2025-11-25 06:40:14.05643807 +0000 UTC m=+0.031483066 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 25 06:40:14 compute-0 podman[223745]: 2025-11-25 06:40:14.066902936 +0000 UTC m=+0.040025229 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 25 06:40:15 compute-0 nova_compute[186241]: 2025-11-25 06:40:15.434 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:17 compute-0 nova_compute[186241]: 2025-11-25 06:40:17.231 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:20 compute-0 nova_compute[186241]: 2025-11-25 06:40:20.436 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:22 compute-0 nova_compute[186241]: 2025-11-25 06:40:22.233 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:23 compute-0 nova_compute[186241]: 2025-11-25 06:40:23.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:40:23 compute-0 nova_compute[186241]: 2025-11-25 06:40:23.932 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11872
Nov 25 06:40:25 compute-0 podman[223782]: 2025-11-25 06:40:25.077034639 +0000 UTC m=+0.057432275 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:40:25 compute-0 nova_compute[186241]: 2025-11-25 06:40:25.437 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:27 compute-0 nova_compute[186241]: 2025-11-25 06:40:27.234 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:29 compute-0 podman[223805]: 2025-11-25 06:40:29.065974719 +0000 UTC m=+0.040916920 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:40:29 compute-0 podman[223806]: 2025-11-25 06:40:29.068129388 +0000 UTC m=+0.041176187 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 06:40:30 compute-0 nova_compute[186241]: 2025-11-25 06:40:30.439 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:32 compute-0 podman[223844]: 2025-11-25 06:40:32.055873308 +0000 UTC m=+0.034631307 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 25 06:40:32 compute-0 nova_compute[186241]: 2025-11-25 06:40:32.235 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:34 compute-0 nova_compute[186241]: 2025-11-25 06:40:34.435 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:40:35 compute-0 nova_compute[186241]: 2025-11-25 06:40:35.441 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:35 compute-0 nova_compute[186241]: 2025-11-25 06:40:35.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:40:36 compute-0 podman[223860]: 2025-11-25 06:40:36.061945545 +0000 UTC m=+0.040653854 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, architecture=x86_64, config_id=edpm, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Nov 25 06:40:36 compute-0 nova_compute[186241]: 2025-11-25 06:40:36.931 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:40:37 compute-0 nova_compute[186241]: 2025-11-25 06:40:37.237 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:38 compute-0 nova_compute[186241]: 2025-11-25 06:40:38.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:40:39 compute-0 nova_compute[186241]: 2025-11-25 06:40:39.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:40:40 compute-0 nova_compute[186241]: 2025-11-25 06:40:40.443 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:40 compute-0 nova_compute[186241]: 2025-11-25 06:40:40.445 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:40:40 compute-0 nova_compute[186241]: 2025-11-25 06:40:40.445 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:40:40 compute-0 nova_compute[186241]: 2025-11-25 06:40:40.445 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:40:40 compute-0 nova_compute[186241]: 2025-11-25 06:40:40.445 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:937
Nov 25 06:40:40 compute-0 nova_compute[186241]: 2025-11-25 06:40:40.619 186245 WARNING nova.virt.libvirt.driver [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 06:40:40 compute-0 nova_compute[186241]: 2025-11-25 06:40:40.620 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5767MB free_disk=73.01773452758789GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": 
"0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 06:40:40 compute-0 nova_compute[186241]: 2025-11-25 06:40:40.620 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:40:40 compute-0 nova_compute[186241]: 2025-11-25 06:40:40.620 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:40:41 compute-0 nova_compute[186241]: 2025-11-25 06:40:41.653 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 06:40:41 compute-0 nova_compute[186241]: 2025-11-25 06:40:41.653 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 06:40:41 compute-0 nova_compute[186241]: 2025-11-25 06:40:41.671 186245 DEBUG nova.compute.provider_tree [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed in ProviderTree for provider: b9b31722-b833-4ea1-a013-247935742e36 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 06:40:42 compute-0 nova_compute[186241]: 2025-11-25 06:40:42.174 186245 DEBUG nova.scheduler.client.report [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Inventory has not changed for provider b9b31722-b833-4ea1-a013-247935742e36 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:958
Nov 25 06:40:42 compute-0 nova_compute[186241]: 2025-11-25 06:40:42.175 186245 DEBUG nova.compute.resource_tracker [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 06:40:42 compute-0 nova_compute[186241]: 2025-11-25 06:40:42.175 186245 DEBUG oslo_concurrency.lockutils [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:40:42 compute-0 nova_compute[186241]: 2025-11-25 06:40:42.238 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:43 compute-0 nova_compute[186241]: 2025-11-25 06:40:43.174 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:40:43 compute-0 nova_compute[186241]: 2025-11-25 06:40:43.927 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:40:43 compute-0 nova_compute[186241]: 2025-11-25 06:40:43.930 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:40:43 compute-0 nova_compute[186241]: 2025-11-25 06:40:43.931 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11153
Nov 25 06:40:44 compute-0 nova_compute[186241]: 2025-11-25 06:40:44.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:40:44 compute-0 nova_compute[186241]: 2025-11-25 06:40:44.932 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11834
Nov 25 06:40:45 compute-0 podman[223878]: 2025-11-25 06:40:45.06297145 +0000 UTC m=+0.037882593 container health_status 0ffaf51a995c3e7fc30bad18d3a05a321f8c2863aea8a378d811f5dec4be50a3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 25 06:40:45 compute-0 podman[223879]: 2025-11-25 06:40:45.070478423 +0000 UTC m=+0.043313234 container health_status d6aae1cf845485a1535e04b75e183814bc8371bc7c08ec487cf8ea4a44f2f544 (image=quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ceilometer-compute:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 06:40:45 compute-0 nova_compute[186241]: 2025-11-25 06:40:45.436 186245 DEBUG nova.compute.manager [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11843
Nov 25 06:40:45 compute-0 nova_compute[186241]: 2025-11-25 06:40:45.445 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:45 compute-0 nova_compute[186241]: 2025-11-25 06:40:45.932 186245 DEBUG oslo_service.periodic_task [None req-691b029b-64d1-4420-a12f-36ae7026c065 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 06:40:47 compute-0 nova_compute[186241]: 2025-11-25 06:40:47.240 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:40:48.084 103953 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 06:40:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:40:48.085 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 06:40:48 compute-0 ovn_metadata_agent[103948]: 2025-11-25 06:40:48.085 103953 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 06:40:50 compute-0 nova_compute[186241]: 2025-11-25 06:40:50.446 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:52 compute-0 nova_compute[186241]: 2025-11-25 06:40:52.242 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:55 compute-0 nova_compute[186241]: 2025-11-25 06:40:55.447 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:40:56 compute-0 podman[223917]: 2025-11-25 06:40:56.078908409 +0000 UTC m=+0.057364667 container health_status f95f41c5ef290dbcb02759d566fcf019d200c74da8b6efd896d23da81cd3ef54 (image=quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-ovn-controller:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 06:40:57 compute-0 nova_compute[186241]: 2025-11-25 06:40:57.243 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:41:00 compute-0 podman[223941]: 2025-11-25 06:41:00.066171199 +0000 UTC m=+0.044108764 container health_status 36dcb3b36b8208c3fd97ab7e39e75ec826bacfef3212276a346235e071b879fd (image=quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-multipathd:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78)
Nov 25 06:41:00 compute-0 podman[223942]: 2025-11-25 06:41:00.066223026 +0000 UTC m=+0.042226798 container health_status 834dd60ccb0f60ef92130c06da115a2fb50eacd60f5da0e1e4e6a85899650226 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 06:41:00 compute-0 nova_compute[186241]: 2025-11-25 06:41:00.448 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:41:00 compute-0 sshd-session[223979]: Accepted publickey for zuul from 192.168.122.10 port 41032 ssh2: ECDSA SHA256:HayXYD5ZFVP9tWCrzNiutOXnTkMlztMVigeL/tDds08
Nov 25 06:41:00 compute-0 systemd-logind[744]: New session 29 of user zuul.
Nov 25 06:41:00 compute-0 systemd[1]: Started Session 29 of User zuul.
Nov 25 06:41:00 compute-0 sshd-session[223979]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 06:41:00 compute-0 sudo[223983]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 25 06:41:00 compute-0 sudo[223983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 06:41:02 compute-0 nova_compute[186241]: 2025-11-25 06:41:02.243 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:41:02 compute-0 podman[224111]: 2025-11-25 06:41:02.485245965 +0000 UTC m=+0.040830036 container health_status 62a0f124f0e7e4686df203b3908382d75b3f00c6a3f80447a1450a3b6e00cf62 (image=quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos9/openstack-neutron-metadata-agent-ovn:2cf1dc4bf18c6d57e990b3cd04e8ec78', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=2cf1dc4bf18c6d57e990b3cd04e8ec78, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 06:41:04 compute-0 ovs-vsctl[224159]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 25 06:41:04 compute-0 virtqemud[186538]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 25 06:41:04 compute-0 virtqemud[186538]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 25 06:41:04 compute-0 virtqemud[186538]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 25 06:41:05 compute-0 nova_compute[186241]: 2025-11-25 06:41:05.449 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:41:05 compute-0 crontab[224536]: (root) LIST (root)
Nov 25 06:41:07 compute-0 podman[224633]: 2025-11-25 06:41:07.121904851 +0000 UTC m=+0.090300791 container health_status f3070edeeb1807616794a526b4ec80487f3398b74aa02e41582b1e0fc6328d6d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, version=9.6, io.openshift.expose-services=, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter)
Nov 25 06:41:07 compute-0 nova_compute[186241]: 2025-11-25 06:41:07.245 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 06:41:07 compute-0 systemd[1]: Starting Hostname Service...
Nov 25 06:41:07 compute-0 systemd[1]: Started Hostname Service.
Nov 25 06:41:10 compute-0 nova_compute[186241]: 2025-11-25 06:41:10.451 186245 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
